ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-5Qw
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.12 (main, Jan 8 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Sunday 18 January 2026 01:44:39 -0500 (0:00:00.431) 0:00:00.431 ********
[WARNING]: Platform linux on host managed-node9 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
ok: [managed-node9]
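The interpreter-discovery warning above can be silenced by pinning the interpreter per host, so a later Python install cannot change what the discovered path means. A minimal sketch, assuming inventory host_vars are in use (the file location is illustrative; ansible_python_interpreter is the standard variable):

```yaml
# inventory/host_vars/managed-node9.yml (illustrative location)
# Pin the interpreter that discovery found on this host.
ansible_python_interpreter: /usr/bin/python3.9
```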
TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Sunday 18 January 2026 01:44:44 -0500 (0:00:05.406) 0:00:05.838 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Sunday 18 January 2026 01:44:44 -0500 (0:00:00.237) 0:00:06.095 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Sunday 18 January 2026 01:44:45 -0500 (0:00:00.162) 0:00:06.257 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Sunday 18 January 2026 01:44:45 -0500 (0:00:00.186) 0:00:06.444 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Sunday 18 January 2026 01:44:45 -0500 (0:00:00.217) 0:00:06.661 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Sunday 18 January 2026 01:44:45 -0500 (0:00:00.238) 0:00:06.899 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Sunday 18 January 2026 01:44:45 -0500 (0:00:00.221) 0:00:07.121 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Sunday 18 January 2026 01:44:46 -0500 (0:00:00.247) 0:00:07.369 ********
included: fedora.linux_system_roles.storage for managed-node9
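All seven FIPS-related tasks skip on the same guard. Judging from the false_condition strings in the output, each is gated roughly like this (a sketch, not the test's verbatim source; the command shown is illustrative, only the when clause is taken from the log):

```yaml
- name: Enable FIPS mode
  # Illustrative task body; the real one is not shown in the log.
  ansible.builtin.command: fips-mode-setup --enable
  when: lookup("env", "SYSTEM_ROLES_TEST_FIPS") == "true"
```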
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 01:44:46 -0500 (0:00:00.662) 0:00:08.031 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 01:44:47 -0500 (0:00:00.276) 0:00:08.307 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 01:44:47 -0500 (0:00:00.471) 0:00:08.779 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 18 January 2026 01:44:48 -0500 (0:00:00.437) 0:00:09.216 ********
ok: [managed-node9] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 18 January 2026 01:44:50 -0500 (0:00:02.501) 0:00:11.717 ********
ok: [managed-node9] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }
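The skip/ok pattern above (RedHat.yml and CentOS.yml skipped, CentOS_9.yml loaded by the loop) is the usual generic-to-specific vars lookup. A minimal sketch of the idiom, assuming the candidate filenames are derived from ansible_facts (everything here except the filenames and the `__vars_file is file` guard, which come straight from the log, is illustrative):

```yaml
- name: Set platform/version specific variables
  ansible.builtin.include_vars: "{{ __vars_file }}"
  loop:
    - RedHat.yml     # os_family level
    - CentOS.yml     # distribution level
    - CentOS_9.yml   # distribution + major version level
  vars:
    __vars_file: "{{ role_path }}/vars/{{ item }}"
  when: __vars_file is file
```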
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 18 January 2026 01:44:50 -0500 (0:00:00.258) 0:00:11.976 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 18 January 2026 01:44:51 -0500 (0:00:00.323) 0:00:12.300 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 18 January 2026 01:44:51 -0500 (0:00:00.189) 0:00:12.489 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 18 January 2026 01:44:51 -0500 (0:00:00.647) 0:00:13.136 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 18 January 2026 01:44:55 -0500 (0:00:04.064) 0:00:17.200 ********
ok: [managed-node9] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 18 January 2026 01:44:56 -0500 (0:00:00.506) 0:00:17.706 ********
ok: [managed-node9] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 18 January 2026 01:44:57 -0500 (0:00:00.534) 0:00:18.241 ********
ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 18 January 2026 01:45:00 -0500 (0:00:03.590) 0:00:21.832 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 18 January 2026 01:45:01 -0500 (0:00:00.476) 0:00:22.309 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:45:01 -0500 (0:00:00.545) 0:00:22.855 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:45:02 -0500 (0:00:00.365) 0:00:23.220 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:45:02 -0500 (0:00:00.264) 0:00:23.484 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:45:04 -0500 (0:00:01.944) 0:00:25.429 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", 
"state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:45:08 -0500 (0:00:04.611) 0:00:30.041 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:45:09 -0500 (0:00:00.790) 0:00:30.831 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:45:09 -0500 (0:00:00.117) 0:00:30.949 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:45:11 -0500 (0:00:01.853) 0:00:32.802 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task 
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Sunday 18 January 2026 01:45:11 -0500 (0:00:00.406) 0:00:33.209 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718545.508246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "af0e1329999499cb4878e3603f4054b5ab27f4ab", "ctime": 1768718543.973239, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768718543.973239, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97
Sunday 18 January 2026 01:45:12 -0500 (0:00:00.919) 0:00:34.128 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Sunday 18 January 2026 01:45:13 -0500 (0:00:00.217) 0:00:34.346 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121
Sunday 18 January 2026 01:45:13 -0500 (0:00:00.220) 0:00:34.530 ********
ok: [managed-node9] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } }

TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Sunday 18 January 2026 01:45:13 -0500 (0:00:00.220) 0:00:34.750 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Sunday 18 January 2026 01:45:13 -0500 (0:00:00.198) 0:00:34.948 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Sunday 18 January 2026 01:45:13 -0500 (0:00:00.240) 0:00:35.188 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }
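The follow-up tasks key off the registered result of the blivet call: fingerprinting /etc/fstab runs only when `blivet_output is changed`, and the systemd daemon reload only when `blivet_output['mounts']` is non-empty. The register-then-test shape is roughly as follows (a sketch; the blivet input parameter names mirror the logged output keys and are assumptions):

```yaml
- name: Manage the pools and volumes to match the specified state
  blivet:   # the role's internal provider module
    pools: "{{ _storage_pools_list }}"
    volumes: "{{ _storage_volumes_list }}"
  register: blivet_output

- name: Tell systemd to refresh its view of /etc/fstab
  ansible.builtin.systemd:
    daemon_reload: true
  when: blivet_output['mounts'] | length > 0   # guard taken from the log
```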
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Sunday 18 January 2026 01:45:14 -0500 (0:00:00.592) 0:00:35.781 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Sunday 18 January 2026 01:45:15 -0500 (0:00:00.662) 0:00:36.444 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Sunday 18 January 2026 01:45:15 -0500 (0:00:00.620) 0:00:37.064 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Sunday 18 January 2026 01:45:16 -0500 (0:00:00.370) 0:00:37.434 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Sunday 18 January 2026 01:45:16 -0500 (0:00:00.510) 0:00:37.945 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768717403.7019038, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1767624696.987, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1767624397.527, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "275126202", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Sunday 18 January 2026 01:45:17 -0500 (0:00:01.213) 0:00:39.159 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Sunday 18 January 2026 01:45:18 -0500 (0:00:00.147) 0:00:39.307 ********
ok: [managed-node9]
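The crypttab handling mirrors the fstab handling: stat the file, then reconcile it against the crypts the blivet call reported. Here `blivet_output['crypts']` is empty, so the manage step skips. A sketch of what reconciling one entry could look like (the lineinfile usage and the item field names are assumptions; the name/device/password column layout of /etc/crypttab is standard):

```yaml
- name: Manage /etc/crypttab to account for changes we just made
  ansible.builtin.lineinfile:
    path: /etc/crypttab
    regexp: "^{{ item.name }} "
    line: "{{ item.name }} {{ item.backing_device }} {{ item.password }}"
  loop: "{{ blivet_output.crypts }}"
```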
TASK [Get unused disks] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:76
Sunday 18 January 2026 01:45:19 -0500 (0:00:01.735) 0:00:41.042 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node9

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Sunday 18 January 2026 01:45:20 -0500 (0:00:00.447) 0:00:41.489 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: util-linux-core

TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Sunday 18 January 2026 01:45:22 -0500 (0:00:02.323) 0:00:43.813 ********
ok: [managed-node9] => {
    "changed": false,
    "disks": [ "sda" ],
    "info": [
        "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG_SEC=\"512\"",
        "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
        "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
        "filename [xvda1] is a partition",
        "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions"
    ]
}

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Sunday 18 January 2026 01:45:26 -0500 (0:00:03.819) 0:00:47.633 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Sunday 18 January 2026 01:45:26 -0500 (0:00:00.159) 0:00:47.793 ********
ok: [managed-node9] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Sunday 18 January 2026 01:45:26 -0500 (0:00:00.285) 0:00:48.078 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" }
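The helper sets the `unused_disks` fact and ends the play when fewer than `disks_needed` disks are found (both names appear in the skip conditions above, with `disks_needed` defaulting to 1). Callers include it roughly like this (a sketch; passing `disks_needed` as a var is an assumption based on that default):

```yaml
- name: Get unused disks
  ansible.builtin.include_tasks: get_unused_disk.yml
  vars:
    disks_needed: 1   # this test only needs one disk for the LUKS volume
```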
TASK [Print unused disks] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Sunday 18 January 2026 01:45:27 -0500 (0:00:00.615) 0:00:48.694 ********
ok: [managed-node9] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:85
Sunday 18 January 2026 01:45:27 -0500 (0:00:00.233) 0:00:48.927 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Sunday 18 January 2026 01:45:28 -0500 (0:00:00.496) 0:00:49.423 ********
ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Sunday 18 January 2026 01:45:28 -0500 (0:00:00.650) 0:00:50.074 ********
included: fedora.linux_system_roles.storage for managed-node9

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 01:45:29 -0500 (0:00:00.433) 0:00:50.508 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 01:45:29 -0500 (0:00:00.378) 0:00:50.886 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 01:45:30 -0500 (0:00:00.518) 0:00:51.408 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
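This is the negative test: an encrypted volume with no key must make the role fail while safe mode is on. Assembled from the storage_volumes echoed later in this run and the stored globals above, the invocation under test looks roughly like this (a sketch; the block/rescue wiring that asserts on the expected failure lives in verify-role-failed.yml and is not shown in the log):

```yaml
- name: Verify role raises correct error
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_safe_mode: true          # mirrors storage_safe_mode_global above
    storage_volumes:
      - name: foo
        type: disk
        disks: "{{ unused_disks }}"  # ["sda"] in this run
        mount_point: /opt/test1
        encryption: true             # deliberately no encryption_password
```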
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 01:45:29 -0500 (0:00:00.433) 0:00:50.508 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 01:45:29 -0500 (0:00:00.378) 0:00:50.886 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 01:45:30 -0500 (0:00:00.518) 0:00:51.408 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 18 January 2026 01:45:30 -0500 (0:00:00.602) 0:00:52.011 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 18 January 2026 01:45:31 -0500 (0:00:00.258) 0:00:52.270 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 18 January 2026 01:45:31 -0500 (0:00:00.250) 0:00:52.521 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 18 January 2026 01:45:31 -0500 (0:00:00.200) 0:00:52.721 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 18 January 2026 01:45:31 -0500 (0:00:00.148) 0:00:52.870 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 18 January 2026 01:45:32 -0500 (0:00:00.695) 0:00:53.565 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 18 January 2026 01:45:34 -0500 (0:00:02.071) 0:00:55.636 ********
ok: [managed-node9] => { "storage_pools | d([])": [] }
TASK
[fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:45:34 -0500 (0:00:00.523) 0:00:56.160 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:45:35 -0500 (0:00:00.565) 0:00:56.725 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:45:37 -0500 (0:00:02.279) 0:00:59.005 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:45:38 -0500 (0:00:00.461) 0:00:59.466 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:45:38 -0500 (0:00:00.432) 0:00:59.899 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:45:39 -0500 (0:00:00.426) 0:01:00.326 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:45:39 -0500 (0:00:00.441) 0:01:00.767 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:45:41 -0500 (0:00:02.118) 0:01:02.886 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" 
}, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": 
"systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": 
"not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:45:44 -0500 (0:00:02.861) 0:01:05.748 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:45:45 -0500 (0:00:00.809) 0:01:06.557 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:45:45 -0500 (0:00:00.151) 0:01:06.709 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 01:45:47 -0500 (0:00:02.267) 0:01:08.976 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 
TASK [Create an encrypted disk volume w/ default fs] ***************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:100
Sunday 18 January 2026 01:45:49 -0500 (0:00:00.275) 0:01:10.348 ********
included: fedora.linux_system_roles.storage for managed-node9
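The storage_volumes dump a few tasks below shows this run reuses the volume spec that just failed, now carrying a passphrase, so the role can proceed. Reconstructed as a sketch (the exact task wording in tests_luks.yml may differ):

    storage_volumes:
      - name: foo
        type: disk
        disks: "{{ unused_disks }}"
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo  # test-only passphrase, visible in plain text in this log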
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 01:45:49 -0500 (0:00:00.641) 0:01:10.989 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 01:45:50 -0500 (0:00:00.257) 0:01:11.247 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 01:45:50 -0500 (0:00:00.602) 0:01:11.850 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 18 January 2026 01:45:51 -0500 (0:00:00.272) 0:01:12.619 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 18 January 2026 01:45:51 -0500 (0:00:00.249) 0:01:12.892 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 18 January 2026 01:45:51 -0500 (0:00:00.249) 0:01:13.141 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 18 January 2026 01:45:52 -0500 (0:00:00.281) 0:01:13.423 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 18 January 2026 01:45:52 -0500 (0:00:00.694) 0:01:14.134 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount
to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:45:53 -0500 (0:00:00.622) 0:01:14.757 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:45:55 -0500 (0:00:02.250) 0:01:17.007 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:45:56 -0500 (0:00:00.612) 0:01:17.620 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:45:57 -0500 (0:00:00.615) 0:01:18.235 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:45:59 -0500 (0:00:02.495) 0:01:20.731 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:45:59 -0500 (0:00:00.423) 0:01:21.154 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:46:00 -0500 (0:00:00.469) 0:01:21.624 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:46:00 -0500 (0:00:00.349) 0:01:21.974 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required 
packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:46:01 -0500 (0:00:00.546) 0:01:22.520 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:46:03 -0500 (0:00:02.198) 0:01:24.719 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { 
"name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": 
"sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": 
"systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": 
"systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:46:06 -0500 (0:00:02.990) 0:01:27.709 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:46:07 -0500 (0:00:00.763) 0:01:28.473 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:46:07 -0500 (0:00:00.206) 0:01:28.679 ******** changed: [managed-node9] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:46:20 -0500 (0:00:12.667) 0:01:41.347 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:46:20 -0500 (0:00:00.461) 0:01:41.808 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718545.508246, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "af0e1329999499cb4878e3603f4054b5ab27f4ab", "ctime": 1768718543.973239, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768718543.973239, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:46:21 -0500 (0:00:01.263) 0:01:43.071 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:46:24 -0500 (0:00:03.072) 0:01:46.144 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:46:25 -0500 (0:00:00.145) 0:01:46.290 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:46:25 -0500 (0:00:00.288) 0:01:46.581 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 01:46:25 -0500 (0:00:00.214) 0:01:46.795 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": 
null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 01:46:25 -0500 (0:00:00.241) 0:01:47.036 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 01:46:26 -0500 (0:00:00.680) 0:01:47.717 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 01:46:32 -0500 (0:00:06.195) 0:01:53.912 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 01:46:36 -0500 (0:00:03.624) 0:01:57.537 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:46:36 -0500 (0:00:00.617) 0:01:58.155 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK 
[fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:46:38 -0500 (0:00:01.861) 0:02:00.016 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768717403.7019038, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1767624696.987, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1767624397.527, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "275126202", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:46:39 -0500 (0:00:01.175) 0:02:01.192 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:46:41 -0500 (0:00:01.556) 0:02:02.748 ******** ok: [managed-node9] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:112 Sunday 18 January 2026 01:46:43 -0500 (0:00:01.805) 0:02:04.554 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:46:43 -0500 (0:00:00.574) 0:02:05.128 ******** skipping: [managed-node9] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:46:44 -0500 (0:00:00.477) 0:02:05.605 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 01:46:45 -0500 (0:00:00.615) 0:02:06.220 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "size": "10G", "type": "crypt", "uuid": "82879a6d-0587-49bb-a4a6-e737e6eb0bee" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 01:46:48 -0500 (0:00:03.102) 0:02:09.323 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002966", "end": "2026-01-18 01:46:50.974237", "rc": 0, "start": "2026-01-18 01:46:50.971271" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. 
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 01:46:51 -0500 (0:00:03.005) 0:02:12.328 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003641", "end": "2026-01-18 01:46:52.106184", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:46:52.102543" } STDOUT: luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 01:46:52 -0500 (0:00:01.147) 0:02:13.475 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 01:46:52 -0500 (0:00:00.489) 0:02:13.965 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [],
'_device': '/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 01:46:53 -0500 (0:00:00.800) 0:02:14.765 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 01:46:54 -0500 (0:00:00.574) 0:02:15.340 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 01:46:56 -0500 (0:00:02.008) 0:02:17.348 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 01:46:56 -0500 (0:00:00.323) 0:02:17.672 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 01:46:57 -0500 (0:00:00.746) 0:02:18.418 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not 
storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:46:57 -0500 (0:00:00.714) 0:02:19.133 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:46:58 -0500 (0:00:00.270) 0:02:19.404 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:46:58 -0500 (0:00:00.587) 0:02:19.991 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:46:59 -0500 (0:00:00.706) 0:02:20.698 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:47:00 -0500 (0:00:00.696) 0:02:21.395 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:47:00 -0500 (0:00:00.219) 0:02:21.614 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:47:00 -0500 (0:00:00.171) 0:02:21.786 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:47:00 -0500 (0:00:00.141) 0:02:21.927 ******** ok: [managed-node9] => { "ansible_facts": { 
"storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:47:00 -0500 (0:00:00.102) 0:02:22.030 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:47:01 -0500 (0:00:00.968) 0:02:22.998 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:47:02 -0500 (0:00:00.574) 0:02:23.573 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:47:03 -0500 (0:00:00.673) 0:02:24.247 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:47:03 -0500 (0:00:00.586) 0:02:24.834 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:47:04 -0500 (0:00:00.698) 0:02:25.532 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:47:04 -0500 (0:00:00.250) 0:02:25.783 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:47:05 -0500 (0:00:00.646) 0:02:26.430 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:47:05 -0500 (0:00:00.714) 0:02:27.144 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718779.4413395, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768718779.4413395, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 447, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768718779.4413395, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:47:07 -0500 (0:00:01.104) 0:02:28.249 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:47:07 -0500 (0:00:00.343) 0:02:28.592 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:47:07 -0500 (0:00:00.137) 0:02:28.730 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:47:07 -0500 (0:00:00.265) 0:02:28.995 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:47:08 -0500 (0:00:00.242) 0:02:29.238 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 01:47:08 -0500 
(0:00:00.213) 0:02:29.451 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 01:47:08 -0500 (0:00:00.298) 0:02:29.750 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718779.9023416, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768718779.9023416, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1019, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768718779.9023416, "nlink": 1, "path": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 01:47:09 -0500 (0:00:01.227) 0:02:30.978 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 01:47:12 -0500 (0:00:02.282) 0:02:33.261 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007353", "end": "2026-01-18 01:47:13.041198", "rc": 0, "start": "2026-01-18 01:47:13.033845" } STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           15e4a2f9-7dc9-4007-a374-efc5cd11b0eb
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)
Data segments:
  0: crypt
     offset: 16777216 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]
Keyslots:
  0: luks2
     Key:        512 bits
     Priority:   normal
     Cipher:     aes-xts-plain64
     Cipher key: 512 bits
     PBKDF:      argon2id
     Time cost:  4
     Memory:     668008
     Threads:    2
     Salt:       26 4e ca a5 a4 aa 0c 74 ad 4c ec 80 82 07 f7 a7 51 36 af c0 f5 b6 43 be 6a 0c dd 81 0e 5b 38 2a
     AF stripes: 4000
     AF hash:    sha256
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
     Hash:       sha256
     Iterations: 133338
     Salt:       51 8a 1a c1 ad 66 73 3d c7 76 73 92 d4 87 37 8a 5b fb 71 ff ce ef b6 67 fc 6e 39 32 49 f9 8c 11
     Digest:     32 56 90 6f 29 f8 15 b2 82 6a b7 b7 20 70 1d 4d 99 61 8b b1 a5 3c e3 3b 7b bb 22 b1 18 97 a4 95
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 01:47:13 -0500 (0:00:01.167) 0:02:34.428 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 01:47:13 -0500 (0:00:00.508) 0:02:34.937 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 01:47:14 -0500 (0:00:00.643) 0:02:35.581 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 01:47:14 -0500 (0:00:00.318) 0:02:35.900 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 01:47:14 -0500 (0:00:00.263) 0:02:36.163 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 01:47:15 -0500 (0:00:00.339) 0:02:36.502 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 01:47:15 -0500 (0:00:00.285) 0:02:36.788 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 01:47:15 -0500 (0:00:00.264) 0:02:37.052 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 01:47:16 -0500 (0:00:00.723) 0:02:37.775 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 01:47:17 -0500 (0:00:00.855) 0:02:38.631 ******** ok: [managed-node9] => { "changed": false } 
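
[The crypttab checks here work against the entry captured above in _storage_test_crypttab_entries, "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda -". A crypttab(5) line has the form <name> <backing device> <key file> [options], and "-" in the key-file column means no key file, so the passphrase is prompted for at activation. A minimal sketch of the same validation outside the role's test fixtures; the lookup and field checks are illustrative:

    - name: Read /etc/crypttab
      ansible.builtin.slurp:
        src: /etc/crypttab
      register: crypttab_raw

    - name: Validate the entry for the LUKS volume
      vars:
        entry: "{{ (crypttab_raw.content | b64decode).splitlines()
                   | select('search', '^luks-') | first | split }}"
      ansible.builtin.assert:
        that:
          - entry | length >= 3
          - entry[1] == '/dev/sda'   # backing device
          - entry[2] == '-'          # no key file; passphrase at activation
]
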
MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 01:47:18 -0500 (0:00:00.593) 0:02:39.224 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:47:18 -0500 (0:00:00.516) 0:02:39.741 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:47:19 -0500 (0:00:00.507) 0:02:40.248 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:47:19 -0500 (0:00:00.252) 0:02:40.501 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:47:19 -0500 (0:00:00.142) 0:02:40.644 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:47:19 -0500 (0:00:00.186) 0:02:40.830 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:47:19 -0500 (0:00:00.230) 0:02:41.061 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:47:20 -0500 (0:00:00.210) 0:02:41.271 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:47:20 -0500 (0:00:00.216) 0:02:41.487 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:47:20 -0500 (0:00:00.168) 0:02:41.655 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:47:20 -0500 (0:00:00.170) 0:02:41.825 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:47:20 -0500 (0:00:00.281) 0:02:42.107 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:47:21 -0500 (0:00:00.170) 0:02:42.278 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:47:21 -0500 (0:00:00.160) 0:02:42.438 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:47:21 -0500 (0:00:00.440) 0:02:42.879 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:47:22 -0500 (0:00:00.393) 0:02:43.272 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:47:22 -0500 (0:00:00.506) 0:02:43.779 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:47:22 -0500 (0:00:00.247) 0:02:44.027 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:47:23 -0500 (0:00:00.499) 0:02:44.527 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:47:23 -0500 (0:00:00.474) 0:02:45.001 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:47:24 -0500 (0:00:00.528) 0:02:45.529 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:47:24 -0500 (0:00:00.634) 0:02:46.163 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:47:26 -0500 (0:00:01.103) 0:02:47.267 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:47:26 -0500 (0:00:00.272) 0:02:47.539 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:47:27 -0500 (0:00:00.770) 0:02:48.310 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": 
"Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:47:27 -0500 (0:00:00.326) 0:02:48.637 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:47:27 -0500 (0:00:00.235) 0:02:48.873 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:47:27 -0500 (0:00:00.257) 0:02:49.130 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:47:28 -0500 (0:00:00.262) 0:02:49.393 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:47:28 -0500 (0:00:00.201) 0:02:49.595 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:47:28 -0500 (0:00:00.222) 0:02:49.817 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:47:28 -0500 (0:00:00.243) 0:02:50.087 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:47:29 -0500 (0:00:00.214) 0:02:50.301 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:47:29 -0500 
(0:00:00.227) 0:02:50.528 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 01:47:29 -0500 (0:00:00.275) 0:02:50.804 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:47:29 -0500 (0:00:00.197) 0:02:51.002 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:47:30 -0500 (0:00:00.376) 0:02:51.378 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:47:30 -0500 (0:00:00.229) 0:02:51.608 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:47:30 -0500 (0:00:00.254) 0:02:51.862 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:47:30 -0500 (0:00:00.181) 0:02:52.044 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:47:31 -0500 (0:00:00.445) 0:02:52.490 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:47:31 -0500 (0:00:00.177) 
0:02:52.667 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 01:47:31 -0500 (0:00:00.132) 0:02:52.800 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 01:47:31 -0500 (0:00:00.174) 0:02:52.974 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 01:47:31 -0500 (0:00:00.176) 0:02:53.150 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 01:47:32 -0500 (0:00:00.187) 0:02:53.338 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 01:47:32 -0500 (0:00:00.232) 0:02:53.570 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 01:47:32 -0500 (0:00:00.210) 0:02:53.781 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 01:47:32 -0500 (0:00:00.231) 0:02:54.013 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 
01:47:32 -0500 (0:00:00.170) 0:02:54.184 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:118 Sunday 18 January 2026 01:47:36 -0500 (0:00:03.505) 0:02:57.689 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 01:47:37 -0500 (0:00:00.722) 0:02:58.412 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 01:47:37 -0500 (0:00:00.610) 0:02:59.023 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:47:38 -0500 (0:00:00.318) 0:02:59.341 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:47:38 -0500 (0:00:00.259) 0:02:59.601 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:47:38 -0500 (0:00:00.460) 0:03:00.061 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:47:39 -0500 (0:00:00.779) 0:03:00.841 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:47:39 -0500 (0:00:00.215) 0:03:01.056 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:47:40 -0500 (0:00:00.257) 0:03:01.314 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:47:40 -0500 (0:00:00.192) 0:03:01.506 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:47:40 -0500 (0:00:00.181) 0:03:01.687 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:47:41 -0500 (0:00:00.911) 0:03:02.599 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:47:43 -0500 (0:00:02.353) 0:03:04.953 ******** ok: [managed-node9] => { 
"storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:47:44 -0500 (0:00:00.587) 0:03:05.540 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:47:44 -0500 (0:00:00.642) 0:03:06.182 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:47:47 -0500 (0:00:02.353) 0:03:08.536 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:47:47 -0500 (0:00:00.517) 0:03:09.054 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:47:48 -0500 (0:00:00.539) 0:03:09.594 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:47:48 -0500 (0:00:00.472) 0:03:10.067 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:47:49 -0500 (0:00:00.431) 0:03:10.498 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:47:51 -0500 (0:00:02.078) 0:03:12.576 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": 
"NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": 
"display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { 
"name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": 
{ "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", 
"status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:47:54 -0500 (0:00:02.980) 0:03:15.557 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:47:55 -0500 (0:00:00.748) 0:03:16.306 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:47:55 -0500 (0:00:00.198) 0:03:16.504 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 01:47:57 -0500 (0:00:02.504) 0:03:19.009 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 
'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:47:58 -0500 (0:00:00.414) 0:03:19.423 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 18 January 2026 01:47:58 -0500 (0:00:00.134) 0:03:19.558 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 18 January 2026 01:47:58 -0500 (0:00:00.246) 0:03:19.804 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 18 January 2026 01:47:59 -0500 (0:00:00.448) 0:03:20.252 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 18 January 2026 01:47:59 -0500 (0:00:00.235) 0:03:20.488 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718856.336699, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768718856.336699, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768718856.336699, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1888315594", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 18 January 2026 01:48:00 -0500 (0:00:01.235) 0:03:21.723 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Sunday 18 January 2026 01:48:00 -0500 (0:00:00.343) 0:03:22.067 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:48:01 -0500 (0:00:01.045) 0:03:23.112 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:48:02 -0500 (0:00:00.308) 0:03:23.420 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:48:02 -0500 (0:00:00.561) 0:03:23.982 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:48:03 -0500 (0:00:00.772) 0:03:24.755 ******** 
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:48:03 -0500 (0:00:00.285) 0:03:25.040 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:48:04 -0500 (0:00:00.243) 0:03:25.284 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:48:04 -0500 (0:00:00.267) 0:03:25.552 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:48:04 -0500 (0:00:00.198) 0:03:25.750 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:48:05 -0500 (0:00:00.780) 0:03:26.530 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:48:07 -0500 (0:00:02.401) 0:03:28.932 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:48:08 -0500 (0:00:00.587) 0:03:29.519 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:48:08 -0500 (0:00:00.682) 0:03:30.201 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": 
[], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:48:11 -0500 (0:00:02.393) 0:03:32.595 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:48:11 -0500 (0:00:00.549) 0:03:33.144 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:48:12 -0500 (0:00:00.431) 0:03:33.576 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:48:12 -0500 (0:00:00.555) 0:03:34.131 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:48:13 -0500 (0:00:00.434) 0:03:34.566 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:48:15 -0500 (0:00:02.027) 0:03:36.594 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": 
"serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:48:18 -0500 (0:00:02.883) 0:03:39.477 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:48:18 -0500 (0:00:00.666) 0:03:40.144 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:48:19 -0500 (0:00:00.149) 0:03:40.293 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:48:21 -0500 (0:00:02.739) 0:03:43.032 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:48:22 -0500 (0:00:00.571) 0:03:43.604 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718796.1404176, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "b7b54e64ed6442882cf35b8c59886c63a80cf3e1", "ctime": 1768718796.1364176, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, 
"isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768718796.1364176, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:48:23 -0500 (0:00:01.175) 0:03:44.780 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:48:24 -0500 (0:00:01.219) 0:03:45.999 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:48:24 -0500 (0:00:00.146) 0:03:46.146 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:48:25 -0500 (0:00:00.414) 0:03:46.560 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 01:48:25 -0500 (0:00:00.367) 0:03:46.928 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 01:48:25 -0500 (0:00:00.259) 0:03:47.187 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 01:48:27 -0500 (0:00:01.789) 0:03:48.976 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 01:48:29 -0500 (0:00:01.888) 0:03:50.865 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': 'UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 01:48:31 -0500 (0:00:01.758) 0:03:52.623 ******** skipping: [managed-node9] => (item={'src': 'UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:48:32 -0500 (0:00:00.645) 0:03:53.269 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:48:33 -0500 (0:00:01.793) 0:03:55.063 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718812.1044922, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "28b5b58604acd76590a2f5504b933c19a46391e7", "ctime": 1768718801.3904421, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 79691975, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768718801.391957, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1211537292", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
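
[Annotation] The stat above shows /etc/crypttab at exactly 53 bytes with mode 0600. That is consistent with a single crypttab(5) entry for this volume: the 41-character mapping name, the backing device, and the "-" key-file placeholder, separated by single spaces and ended by a newline (41 + 1 + 8 + 1 + 1 + 1 = 53). Reconstructed from the crypts list in blivet_output (the file contents themselves are never printed in this log), the entry about to be removed is presumably:

    luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda -

The next task deletes that line, which matches its "1 line(s) removed" message.

TASK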
[fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:48:35 -0500 (0:00:01.219) 0:03:56.283 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:48:36 -0500 (0:00:01.677) 0:03:57.960 ******** ok: [managed-node9] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:151 Sunday 18 January 2026 01:48:38 -0500 (0:00:01.763) 0:03:59.724 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:48:39 -0500 (0:00:01.190) 0:04:00.914 ******** skipping: [managed-node9] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:48:40 -0500 (0:00:00.640) 0:04:01.555 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 01:48:41 -0500 (0:00:00.656) 0:04:02.212 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "68f6c65c-d26d-4019-bfff-78a69e3db92b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 01:48:42 -0500 (0:00:01.196) 0:04:03.408 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003074", "end": "2026-01-18 01:48:43.287794", "rc": 0, "start": "2026-01-18 01:48:43.284720" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 01:48:43 -0500 (0:00:01.260) 0:04:04.669 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003111", "end": "2026-01-18 01:48:44.577762", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:48:44.574651" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 01:48:44 -0500 (0:00:01.278) 0:04:05.947 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 01:48:45 -0500 (0:00:00.450) 0:04:06.398 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task 
path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 01:48:45 -0500 (0:00:00.741) 0:04:07.139 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 01:48:46 -0500 (0:00:00.646) 0:04:07.786 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 01:48:48 -0500 (0:00:01.602) 0:04:09.388 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 01:48:48 -0500 (0:00:00.261) 0:04:09.649 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 01:48:48 -0500 (0:00:00.463) 0:04:10.113 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:48:49 -0500 (0:00:00.687) 0:04:10.800 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:48:49 -0500 (0:00:00.337) 0:04:11.138 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:48:50 -0500 (0:00:00.701) 0:04:11.840 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:48:51 -0500 (0:00:00.718) 0:04:12.558 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:48:51 -0500 (0:00:00.595) 0:04:13.154 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:48:52 -0500 (0:00:00.180) 0:04:13.334 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:48:52 -0500 (0:00:00.192) 0:04:13.527 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:48:52 -0500 (0:00:00.239) 0:04:13.767 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
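
[Annotation] The fstab checks that follow work purely on text: the next task scans the fstab contents captured earlier (registered as storage_test_fstab, judging by the later variable cleanup) for the volume's mount id, mount point, and options, and stores the hits as match lists. The shape of the results shown below (e.g. "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b " with its trailing space, rather than whole fstab lines) suggests a regex_findall over the raw text; conceptually something like this (an illustrative sketch, not the test's literal code):

    - name: Collect the volume's fstab matches   # illustrative sketch
      ansible.builtin.set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout | regex_findall(storage_test_volume._mount_id ~ ' ') }}"

TASK [Set some variables for fstab checking] *********************************** task path: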
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:48:52 -0500 (0:00:00.240) 0:04:14.007 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:48:53 -0500 (0:00:01.054) 0:04:15.062 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:48:54 -0500 (0:00:00.645) 0:04:15.707 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:48:55 -0500 (0:00:00.598) 0:04:16.306 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:48:55 -0500 (0:00:00.566) 0:04:16.872 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:48:56 -0500 (0:00:00.793) 0:04:17.666 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:48:56 -0500 (0:00:00.189) 0:04:17.856 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:48:57 -0500 (0:00:00.596) 0:04:18.452 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:48:57 -0500 (0:00:00.666) 0:04:19.119 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718901.5719104, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768718901.5719104, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 447, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768718901.5719104, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:48:59 -0500 (0:00:01.203) 0:04:20.322 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:48:59 -0500 (0:00:00.257) 0:04:20.579 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:48:59 -0500 (0:00:00.127) 0:04:20.707 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:48:59 -0500 (0:00:00.376) 0:04:21.084 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:49:00 -0500 (0:00:00.240) 0:04:21.325 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 01:49:00 -0500 (0:00:00.322) 0:04:21.530 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
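
[Annotation] From here the per-volume checks move on to encryption. Because the volume now reports encryption: false, nearly every task in test-verify-volume-encryption.yml is skipped; only the unconditional cryptsetup package check, the crypttab bookkeeping, and its entry-count assertion actually run. An independent way to confirm the LUKS header is really gone would be a task like this (illustrative only, not part of the test suite; the register variable name is hypothetical):

    - name: Assert /dev/sda no longer carries a LUKS signature   # illustrative only
      ansible.builtin.command: cryptsetup isLuks /dev/sda
      register: storage_test_isluks                # hypothetical variable name
      failed_when: storage_test_isluks.rc == 0     # cryptsetup isLuks exits 0 only if a LUKS header is present
      changed_when: false

TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 01:49:00 -0500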
(0:00:00.322) 0:04:21.852 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 01:49:00 -0500 (0:00:00.161) 0:04:22.013 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 01:49:02 -0500 (0:00:02.183) 0:04:24.197 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 01:49:03 -0500 (0:00:00.343) 0:04:24.540 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 01:49:03 -0500 (0:00:00.191) 0:04:24.732 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 01:49:04 -0500 (0:00:00.574) 0:04:25.306 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 01:49:04 -0500 (0:00:00.181) 0:04:25.488 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 01:49:04 -0500 (0:00:00.200) 0:04:25.688 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 01:49:04 -0500 (0:00:00.263) 0:04:25.952 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 01:49:04 -0500 (0:00:00.238) 0:04:26.190 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 01:49:05 -0500 (0:00:00.227) 0:04:26.418 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 01:49:05 -0500 (0:00:00.741) 0:04:27.159 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 01:49:06 -0500 (0:00:00.638) 0:04:27.798 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 01:49:07 -0500 (0:00:00.701) 0:04:28.499 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:49:07 -0500 (0:00:00.692) 0:04:29.192 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:49:08 -0500 (0:00:00.532) 0:04:29.724 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:49:08 -0500 (0:00:00.253) 0:04:29.978 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:49:08 -0500 (0:00:00.203) 0:04:30.181 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:49:09 -0500 (0:00:00.175) 0:04:30.357 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:49:09 -0500 (0:00:00.183) 0:04:30.540 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:49:09 -0500 (0:00:00.189) 0:04:30.730 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:49:09 -0500 (0:00:00.195) 0:04:30.925 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:49:09 -0500 (0:00:00.253) 0:04:31.179 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:49:10 -0500 (0:00:00.205) 0:04:31.384 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:49:10 -0500 (0:00:00.198) 0:04:31.583 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:49:10 -0500 (0:00:00.206) 0:04:31.789 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:49:10 -0500 (0:00:00.268) 0:04:32.058 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:49:11 -0500 (0:00:00.557) 0:04:32.616 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:49:11 -0500 (0:00:00.561) 0:04:33.177 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:49:12 -0500 (0:00:00.528) 0:04:33.705 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:49:12 -0500 (0:00:00.231) 0:04:33.936 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:49:13 -0500 (0:00:00.569) 0:04:34.511 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:49:13 -0500 (0:00:00.595) 0:04:35.107 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:49:14 -0500 (0:00:00.471) 0:04:35.579 
******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:49:14 -0500 (0:00:00.445) 0:04:36.025 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:49:15 -0500 (0:00:00.520) 0:04:36.546 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:49:15 -0500 (0:00:00.280) 0:04:36.826 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:49:15 -0500 (0:00:00.229) 0:04:37.056 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:49:16 -0500 (0:00:00.304) 0:04:37.360 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:49:16 -0500 (0:00:00.239) 0:04:37.600 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:49:16 -0500 (0:00:00.247) 0:04:37.847 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:49:16 -0500 (0:00:00.242) 0:04:38.090 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was 
False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:49:17 -0500 (0:00:00.283) 0:04:38.373 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:49:17 -0500 (0:00:00.312) 0:04:38.686 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:49:17 -0500 (0:00:00.313) 0:04:39.000 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:49:18 -0500 (0:00:00.299) 0:04:39.299 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:49:18 -0500 (0:00:00.310) 0:04:39.610 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 01:49:18 -0500 (0:00:00.326) 0:04:39.937 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:49:18 -0500 (0:00:00.262) 0:04:40.200 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:49:19 -0500 (0:00:00.301) 0:04:40.501 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:49:19 -0500 (0:00:00.321) 
0:04:40.822 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:49:19 -0500 (0:00:00.190) 0:04:41.013 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:49:19 -0500 (0:00:00.174) 0:04:41.187 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:49:20 -0500 (0:00:00.399) 0:04:41.587 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:49:20 -0500 (0:00:00.234) 0:04:41.846 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 01:49:20 -0500 (0:00:00.186) 0:04:42.033 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 01:49:21 -0500 (0:00:00.194) 0:04:42.227 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 01:49:21 -0500 (0:00:00.181) 0:04:42.408 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 01:49:21 -0500 (0:00:00.216) 0:04:42.625 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 01:49:21 -0500 (0:00:00.236) 0:04:42.861 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 01:49:21 -0500 (0:00:00.241) 0:04:43.103 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 01:49:22 -0500 (0:00:00.238) 0:04:43.342 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 01:49:22 -0500 (0:00:00.310) 0:04:43.652 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:157 Sunday 18 January 2026 01:49:23 -0500 (0:00:01.283) 0:04:44.935 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 01:49:24 -0500 (0:00:00.649) 0:04:45.675 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 01:49:25 -0500 (0:00:00.649) 0:04:46.324 ******** included: fedora.linux_system_roles.storage for managed-node9
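
[Annotation] The "Test for correct handling of safe_mode - 2" block re-runs the role with storage_safe_mode still true (captured above as storage_safe_mode_global) while asking for the volume to be encrypted again. Since /dev/sda now carries a freshly created xfs filesystem holding /opt/test1/quux, safe mode obliges the role to fail rather than reformat it. Judging from the Show storage_volumes dump further down, the failing request is essentially the following (a reconstruction, not the literal test code):

    - name: Re-encrypt sda with safe mode on (expected to fail, not reformat)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true          # assumption: passed through from storage_safe_mode_global
        storage_volumes:
          - name: foo                    # values as printed by the Show storage_volumes task below
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:49:25 -0500 (0:00:00.452) 0:04:46.777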
******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:49:25 -0500 (0:00:00.344) 0:04:47.121 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:49:26 -0500 (0:00:00.739) 0:04:47.860 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:49:27 -0500 (0:00:00.763) 0:04:48.624 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:49:27 -0500 (0:00:00.345) 0:04:48.970 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:49:28 -0500 (0:00:00.260) 
0:04:49.230 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:49:28 -0500 (0:00:00.237) 0:04:49.468 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:49:28 -0500 (0:00:00.182) 0:04:49.650 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:49:28 -0500 (0:00:00.545) 0:04:50.195 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:49:31 -0500 (0:00:03.001) 0:04:53.197 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:49:33 -0500 (0:00:01.264) 0:04:54.461 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:49:35 -0500 (0:00:01.999) 0:04:56.460 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:49:37 -0500 (0:00:02.469) 0:04:58.930 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:49:38 -0500 (0:00:00.499) 0:04:59.429 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": 
"No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:49:38 -0500 (0:00:00.492) 0:04:59.922 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:49:39 -0500 (0:00:00.508) 0:05:00.431 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:49:39 -0500 (0:00:00.486) 0:05:00.917 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:49:41 -0500 (0:00:02.249) 0:05:03.167 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": 
"stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service": { "name": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": 
"systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:49:44 -0500 (0:00:02.930) 0:05:06.098 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:49:45 -0500 (0:00:00.704) 0:05:06.802 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d15e4a2f9\x2d7dc9\x2d4007\x2da374\x2defc5cd11b0eb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "name": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": 
"\"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket cryptsetup-pre.target systemd-journald.socket dev-sda.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d15e4a2f9\\\\x2d7dc9\\\\x2d4007\\\\x2da374\\\\x2defc5cd11b0eb.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d15e4a2f9\\\\x2d7dc9\\\\x2d4007\\\\x2da374\\\\x2defc5cd11b0eb.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 01:48:33 EST", "StateChangeTimestampMonotonic": "10492042387", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d15e4a2f9\\\\x2d7dc9\\\\x2d4007\\\\x2da374\\\\x2defc5cd11b0eb.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:49:47 -0500 (0:00:01.580) 0:05:08.382 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 01:49:49 -0500 (0:00:02.312) 0:05:10.695 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:49:49 -0500 (0:00:00.366) 0:05:11.062 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d15e4a2f9\x2d7dc9\x2d4007\x2da374\x2defc5cd11b0eb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "name": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", 
"LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d15e4a2f9\\x2d7dc9\\x2d4007\\x2da374\\x2defc5cd11b0eb.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d15e4a2f9\\\\x2d7dc9\\\\x2d4007\\\\x2da374\\\\x2defc5cd11b0eb.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Sunday 18 January 2026 01:49:51 -0500 (0:00:00.215) 0:05:12.642 ********
ok: [managed-node9] => { "changed": false }

MSG: All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Sunday 18 January 2026 01:49:51 -0500 (0:00:00.215) 0:05:12.857 ********
ok: [managed-node9] => { "changed": false }

MSG: All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Sunday 18 January 2026 01:49:52 -0500 (0:00:00.376) 0:05:13.234 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }

TASK [Stat the file] ***********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11
Sunday 18 January 2026 01:49:52 -0500 (0:00:00.326) 0:05:13.561 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718963.5732002, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768718963.5732002, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768718963.5732002, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2711446063", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Assert file presence] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16
Sunday 18 January 2026 01:49:53 -0500 (0:00:01.007) 0:05:14.569 ********
ok: [managed-node9] => { "changed": false }

MSG: All assertions passed
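The Stat/Assert pair above is the data-preservation check: /opt/test1/quux was created just before the deliberately failing role run and must survive it untouched; the recorded checksum da39a3ee5e6b4b0d3255bfef95601890afd80709 is the SHA-1 of empty input, matching the zero-byte file created earlier. A minimal sketch of this style of check, with illustrative task wording and register name rather than the test's actual source:

- name: Stat the file
  ansible.builtin.stat:
    path: /opt/test1/quux
  register: __test_file_stat  # illustrative name

- name: Assert file presence
  ansible.builtin.assert:
    that:
      - __test_file_stat.stat.exists
    msg: Data was lost although the failed run should not have touched the filesystem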
"false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:49:55 -0500 (0:00:00.616) 0:05:16.660 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:49:56 -0500 (0:00:00.710) 0:05:17.370 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:49:56 -0500 (0:00:00.267) 0:05:17.638 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:49:56 -0500 (0:00:00.246) 0:05:17.885 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:49:56 -0500 (0:00:00.218) 0:05:18.103 ******** ok: [managed-node9] => { "ansible_facts": { 
"_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:49:57 -0500 (0:00:00.227) 0:05:18.331 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:49:57 -0500 (0:00:00.544) 0:05:18.876 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:49:59 -0500 (0:00:02.035) 0:05:20.912 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:50:00 -0500 (0:00:00.468) 0:05:21.381 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:50:00 -0500 (0:00:00.624) 0:05:22.005 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:50:02 -0500 (0:00:01.957) 0:05:23.962 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:50:03 -0500 (0:00:00.385) 0:05:24.348 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:50:03 -0500 (0:00:00.394) 0:05:24.742 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional 
result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:50:03 -0500 (0:00:00.414) 0:05:25.157 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:50:04 -0500 (0:00:00.336) 0:05:25.493 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:50:06 -0500 (0:00:02.120) 0:05:27.613 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { 
"name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { 
"name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": 
"systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:50:09 -0500 (0:00:03.074) 0:05:30.688 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:50:10 -0500 (0:00:00.948) 0:05:31.636 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:50:10 -0500 (0:00:00.159) 0:05:31.796 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 
0, "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:50:22 -0500 (0:00:12.101) 0:05:43.898 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:50:23 -0500 (0:00:00.422) 0:05:44.320 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718911.2599556, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "497704da186af81e7c2a75c4d77f401ba0486f52", "ctime": 1768718911.2559557, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768718911.2559557, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:50:24 -0500 (0:00:01.037) 0:05:45.358 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:50:25 -0500 (0:00:01.116) 0:05:46.474 ******** skipping: 
[managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:50:25 -0500 (0:00:00.140) 0:05:46.615 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:50:25 -0500 (0:00:00.325) 0:05:46.941 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 01:50:26 -0500 (0:00:00.287) 0:05:47.228 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_kernel_device": "/dev/dm-0", 
"_mount_id": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 01:50:26 -0500 (0:00:00.252) 0:05:47.480 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': 'UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=68f6c65c-d26d-4019-bfff-78a69e3db92b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 01:50:28 -0500 (0:00:01.810) 0:05:49.290 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 01:50:30 -0500 (0:00:01.957) 0:05:51.247 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d" } TASK 
[fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 01:50:31 -0500 (0:00:01.898) 0:05:53.146 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:50:32 -0500 (0:00:00.562) 0:05:53.709 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:50:34 -0500 (0:00:01.984) 0:05:55.693 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768718924.5770178, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768718916.5609803, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 318767303, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1768718916.561175, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1588100610", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:50:35 -0500 (0:00:01.258) 0:05:56.952 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:50:37 -0500 (0:00:01.662) 0:05:58.615 ******** ok: [managed-node9]
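The "Manage /etc/crypttab" change above ("line added") produces the single entry that the later "Read the /etc/crypttab file" task prints: mapping name, backing device, and "-" in the keyfile field (no keyfile, so the passphrase must be supplied when the device is unlocked). The role builds this from blivet's "crypts" output with its own logic; a rough standalone equivalent, sketched here with ansible.builtin.lineinfile purely for illustration, would be:

    - name: Ensure the LUKS mapping is recorded in /etc/crypttab
      ansible.builtin.lineinfile:
        path: /etc/crypttab
        # Match on the mapping name so reruns update rather than duplicate the entry
        regexp: '^luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d\s'
        line: "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda -"
        owner: root
        mode: "0600"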
TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:190 Sunday 18 January 2026 01:50:39 -0500 (0:00:02.095) 0:06:00.711 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:50:40 -0500 (0:00:00.678) 0:06:01.390 ******** skipping: [managed-node9] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:50:40 -0500 (0:00:00.512) 0:06:01.903 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.]
***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 01:50:41 -0500 (0:00:00.600) 0:06:02.503 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "size": "10G", "type": "crypt", "uuid": "3f1341fb-60e0-4f6b-b8f4-e61943c6c8e8" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "4eb8a886-6a8c-44cf-b518-9f5a2406116d" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 01:50:42 -0500 (0:00:01.248) 0:06:03.752 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003404", "end": "2026-01-18 01:50:43.629490", "rc": 0, "start": "2026-01-18 01:50:43.626086" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Sunday 18 January 2026 01:50:43 -0500 (0:00:01.248) 0:06:05.001 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003109", "end": "2026-01-18 01:50:44.872529", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:50:44.869420" }

STDOUT:

luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda -
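Note on the crypttab line above: each /etc/crypttab entry has three whitespace-separated fields, the mapping name (here luks-<LUKS UUID>), the backing device, and the key file, where "-" means no key file is stored and the passphrase must be supplied some other way. The test's own crypttab assertions appear further below; as a minimal illustrative sketch of that kind of check (task and variable names here are hypothetical, not taken from the test):

    - name: Read /etc/crypttab (sketch)
      ansible.builtin.command: cat /etc/crypttab
      register: _crypttab_sketch   # hypothetical variable name
      changed_when: false

    - name: Expect exactly one passphrase-based entry for the volume
      ansible.builtin.assert:
        that:
          - "_crypttab_sketch.stdout_lines | select('match', '^luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda -$') | list | length == 1"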
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Sunday 18 January 2026 01:50:44 -0500 (0:00:01.203) 0:06:06.204 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Sunday 18 January 2026 01:50:45 -0500 (0:00:00.532) 0:06:06.736 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 18 January 2026 01:50:46 -0500 (0:00:00.753) 0:06:07.490 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 18 January 2026 01:50:47 -0500 (0:00:00.762) 0:06:08.253 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Sunday 18 January 2026 01:50:49 -0500 (0:00:02.359) 0:06:10.613 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Sunday 18 January 2026 01:50:49 -0500 (0:00:00.372) 0:06:10.985 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Sunday 18 January 2026 01:50:50 -0500 (0:00:00.682) 0:06:11.668 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason":
"Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:50:51 -0500 (0:00:00.696) 0:06:12.364 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:50:51 -0500 (0:00:00.433) 0:06:12.798 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:50:52 -0500 (0:00:00.627) 0:06:13.425 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:50:52 -0500 (0:00:00.707) 0:06:14.133 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:50:53 -0500 (0:00:00.636) 0:06:14.770 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:50:53 -0500 (0:00:00.185) 0:06:14.955 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:50:53 -0500 (0:00:00.177) 0:06:15.132 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:50:54 -0500 (0:00:00.166) 0:06:15.299 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK 
[Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:50:54 -0500 (0:00:00.183) 0:06:15.495 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:50:55 -0500 (0:00:00.891) 0:06:16.387 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:50:55 -0500 (0:00:00.727) 0:06:17.114 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:50:56 -0500 (0:00:00.795) 0:06:17.909 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:50:57 -0500 (0:00:00.538) 0:06:18.448 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:50:58 -0500 (0:00:00.877) 0:06:19.342 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:50:58 -0500 (0:00:00.252) 0:06:19.595 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:50:59 -0500 (0:00:00.676) 0:06:20.305 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed 
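Note on the fstab subtest that just completed: "Set some variables for fstab checking" derives match lists from the earlier `cat /etc/fstab` output, one keyed on the volume's `_mount_id`, one on the mount point, and one on the options column, and the assertions then require each list to contain exactly the expected number of entries ("1" here). A minimal sketch of deriving such lists (hypothetical names; the role's actual filters may differ):

    - name: Derive fstab match lists (sketch; assumes storage_test_fstab holds the `cat /etc/fstab` result)
      ansible.builtin.set_fact:
        _fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', storage_test_volume._mount_id) | list }}"
        _fstab_mount_point_matches: "{{ storage_test_fstab.stdout_lines | select('search', ' /opt/test1 ') | list }}"

    - name: Expect exactly one fstab entry per criterion
      ansible.builtin.assert:
        that:
          - _fstab_id_matches | length == 1
          - _fstab_mount_point_matches | length == 1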
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:50:59 -0500 (0:00:00.628) 0:06:20.933 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719022.0074735, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719022.0074735, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 447, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768719022.0074735, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:51:01 -0500 (0:00:01.425) 0:06:22.359 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:51:01 -0500 (0:00:00.455) 0:06:22.814 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:51:01 -0500 (0:00:00.247) 0:06:23.062 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:51:02 -0500 (0:00:00.399) 0:06:23.462 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:51:02 -0500 (0:00:00.291) 0:06:23.753 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 01:51:02 -0500 (0:00:00.198) 0:06:23.951 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Sunday 18 January 2026 01:51:02 -0500 (0:00:00.248) 0:06:24.200 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719022.4704754, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719022.4704754, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1147, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719022.4704754, "nlink": 1, "path": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Sunday 18 January 2026 01:51:04 -0500 (0:00:01.377) 0:06:25.578 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Sunday 18 January 2026 01:51:07 -0500 (0:00:03.128) 0:06:28.706 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007032", "end": "2026-01-18 01:51:08.637815", "rc": 0, "start": "2026-01-18 01:51:08.630783" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           4eb8a886-6a8c-44cf-b518-9f5a2406116d
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     665346
        Threads:    2
        Salt:       49 25 c1 30 49 4c b9 ce 72 fc 8c 6d e5 a1 4c 7a
                    d7 01 0d 2c 29 17 82 05 61 b0 90 a3 79 3d 5a bd
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 133338
        Salt:       79 3a dd 39 54 3b 0c 87 b2 72 76 e0 be 45 d1 b6
                    69 10 7a f3 6b 8d 54 f7 41 50 bc fd 7f 1a 6c a5
        Digest:     35 19 8b 50 c8 6e 3e 80 a5 f8 02 99 d6 80 3d 82
                    08 66 9a bf a7 87 aa eb 1a 75 59 50 bb 49 2a 22
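Note on the header above: the dump confirms a LUKS2 container (Version: 2) using aes-xts-plain64 with a 512-bit volume key (two 256-bit halves under XTS) and the argon2id PBKDF. The "Check LUKS version", "Check LUKS key size", and "Check LUKS cipher" tasks below only assert these values when the test sets encryption_luks_version, encryption_key_size, or encryption_cipher; here all three are null, so those checks are skipped. A minimal sketch of such an assertion (hypothetical names, not the test's own implementation):

    - name: Dump the LUKS header (sketch)
      ansible.builtin.command: cryptsetup luksDump /dev/sda
      register: _luks_dump_sketch   # hypothetical variable name
      changed_when: false

    - name: Assert LUKS2 with the expected cipher and key size
      ansible.builtin.assert:
        that:
          - "_luks_dump_sketch.stdout is search('Version:\\s+2')"
          - "_luks_dump_sketch.stdout is search('cipher:\\s+aes-xts-plain64')"
          - "_luks_dump_sketch.stdout is search('Key:\\s+512 bits')"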
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Sunday 18 January 2026 01:51:09 -0500 (0:00:02.202) 0:06:30.909 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Sunday 18 January 2026 01:51:10 -0500 (0:00:00.623) 0:06:31.532 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Sunday 18 January 2026 01:51:11 -0500 (0:00:00.724) 0:06:32.256 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Sunday 18 January 2026 01:51:11 -0500 (0:00:00.391) 0:06:32.647 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Sunday 18 January 2026 01:51:11 -0500 (0:00:00.305) 0:06:32.953 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Sunday 18 January 2026 01:51:12 -0500 (0:00:00.299) 0:06:33.252 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Sunday 18 January 2026 01:51:12 -0500 (0:00:00.314) 0:06:33.567 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Sunday 18 January 2026 01:51:12 -0500 (0:00:00.301) 0:06:33.868 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Sunday 18 January 2026 01:51:13 -0500 (0:00:00.881) 0:06:34.749 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Sunday 18 January 2026 01:51:14 -0500 (0:00:00.539) 0:06:35.289 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Sunday 18 January 2026 01:51:14 -0500 (0:00:00.811) 0:06:36.101 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:51:15 -0500 (0:00:00.662) 0:06:36.763 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:51:16 -0500 (0:00:00.670) 0:06:37.434 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:51:16 -0500 (0:00:00.189) 0:06:37.623 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:51:16 -0500 (0:00:00.185) 0:06:37.809 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:51:16 -0500 (0:00:00.142) 0:06:37.951 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:51:16 -0500 (0:00:00.173) 0:06:38.125 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:51:17 -0500 (0:00:00.228) 0:06:38.353 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:51:17 -0500 (0:00:00.178) 0:06:38.532 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was 
False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:51:17 -0500 (0:00:00.198) 0:06:38.731 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:51:17 -0500 (0:00:00.174) 0:06:38.906 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:51:17 -0500 (0:00:00.173) 0:06:39.079 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:51:18 -0500 (0:00:00.215) 0:06:39.295 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:51:18 -0500 (0:00:00.173) 0:06:39.469 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:51:18 -0500 (0:00:00.659) 0:06:40.129 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:51:19 -0500 (0:00:00.558) 0:06:40.687 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:51:20 -0500 (0:00:00.540) 0:06:41.227 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:51:20 -0500 (0:00:00.240) 0:06:41.468 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:51:20 -0500 (0:00:00.579) 0:06:42.048 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:51:21 -0500 (0:00:00.513) 0:06:42.561 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:51:21 -0500 (0:00:00.551) 0:06:43.112 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:51:22 -0500 (0:00:00.525) 0:06:43.638 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:51:22 -0500 (0:00:00.568) 0:06:44.207 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:51:23 -0500 (0:00:00.260) 0:06:44.467 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:51:23 -0500 (0:00:00.254) 0:06:44.722 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:51:23 -0500 (0:00:00.295) 0:06:45.018 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is 
none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:51:24 -0500 (0:00:00.414) 0:06:45.433 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:51:24 -0500 (0:00:00.249) 0:06:45.682 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:51:24 -0500 (0:00:00.242) 0:06:45.925 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:51:25 -0500 (0:00:00.331) 0:06:46.256 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:51:25 -0500 (0:00:00.263) 0:06:46.520 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:51:25 -0500 (0:00:00.251) 0:06:46.771 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:51:25 -0500 (0:00:00.258) 0:06:47.030 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:51:26 -0500 (0:00:00.242) 0:06:47.272 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 
2026 01:51:26 -0500 (0:00:00.312) 0:06:47.585 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:51:26 -0500 (0:00:00.265) 0:06:47.850 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:51:26 -0500 (0:00:00.266) 0:06:48.117 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:51:27 -0500 (0:00:00.266) 0:06:48.383 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:51:27 -0500 (0:00:00.343) 0:06:48.727 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:51:27 -0500 (0:00:00.280) 0:06:49.008 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:51:28 -0500 (0:00:00.549) 0:06:49.558 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:51:28 -0500 (0:00:00.215) 0:06:49.773 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 
January 2026 01:51:28 -0500 (0:00:00.219) 0:06:49.993 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Sunday 18 January 2026 01:51:29 -0500 (0:00:00.250) 0:06:50.244 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Sunday 18 January 2026 01:51:29 -0500 (0:00:00.223) 0:06:50.467 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Sunday 18 January 2026 01:51:29 -0500 (0:00:00.157) 0:06:50.625 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Sunday 18 January 2026 01:51:29 -0500 (0:00:00.190) 0:06:50.815 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Sunday 18 January 2026 01:51:29 -0500 (0:00:00.201) 0:06:51.016 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Sunday 18 January 2026 01:51:30 -0500 (0:00:00.201) 0:06:51.218 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Test for correct handling of new encrypted volume w/ no key - 2] *********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197
Sunday 18 January 2026 01:51:30 -0500 (0:00:00.255) 0:06:51.473 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Sunday 18 January 2026 01:51:31 -0500 (0:00:00.757) 0:06:52.231 ********
ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
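Note on the step above: before exercising the failure path, the test snapshots the role's global inputs (safe mode enabled, empty pool and volume lists) so they can be restored afterwards; "Verify role raises correct error - 2" then invokes the role with a configuration that must fail, a new encrypted volume with no key supplied. A minimal sketch of the usual block/rescue pattern for asserting that a role fails (hypothetical names and structure; verify-role-failed.yml's actual tasks are not shown in this log):

    - name: Run the role and require failure (sketch)
      block:
        - name: Invoke the storage role with the failing configuration
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools: "{{ __failing_pools }}"  # hypothetical: encrypted volume, no key
        - name: Fail if the role unexpectedly succeeded
          ansible.builtin.fail:
            msg: Expected the role to fail, but it succeeded
      rescue:
        - name: Confirm the role failed and inspect the error
          ansible.builtin.assert:
            that:
              - ansible_failed_result is defined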
TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Sunday 18 January 2026 01:51:31 -0500 (0:00:00.478) 0:06:52.709 ********
included: fedora.linux_system_roles.storage for managed-node9

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 01:51:31 -0500 (0:00:00.414) 0:06:53.124 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 01:51:32 -0500 (0:00:00.327) 0:06:53.451 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 01:51:32 -0500 (0:00:00.661) 0:06:54.113 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 18 January 2026 01:51:33 -0500 (0:00:00.922) 0:06:55.035 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not
__storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:51:34 -0500 (0:00:00.305) 0:06:55.341 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:51:34 -0500 (0:00:00.268) 0:06:55.609 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:51:34 -0500 (0:00:00.296) 0:06:55.905 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:51:34 -0500 (0:00:00.234) 0:06:56.164 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:51:35 -0500 (0:00:00.748) 0:06:56.912 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:51:38 -0500 (0:00:02.393) 0:06:59.306 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:51:38 -0500 (0:00:00.686) 0:06:59.993 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:51:39 -0500 (0:00:00.804) 0:07:00.797 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ 
"cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:51:42 -0500 (0:00:02.517) 0:07:03.315 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:51:42 -0500 (0:00:00.497) 0:07:03.812 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:51:43 -0500 (0:00:00.571) 0:07:04.384 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:51:43 -0500 (0:00:00.559) 0:07:04.943 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:51:44 -0500 (0:00:00.487) 0:07:05.430 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:51:46 -0500 (0:00:02.308) 0:07:07.739 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": 
"serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:51:49 -0500 (0:00:03.031) 0:07:10.771 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:51:50 -0500 (0:00:00.726) 0:07:11.498 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Sunday 18 January 2026 01:51:50 -0500 (0:00:00.148) 0:07:11.646 ********
fatal: [managed-node9]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

encrypted volume 'test1' missing key/password

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Sunday 18 January 2026 01:51:53 -0500 (0:00:02.638) 0:07:14.285 ********
fatal: [managed-node9]: FAILED! => {
    "changed": false
}

MSG:

{'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
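For reference, the failing input can be read back out of the module_args above. A minimal sketch of that storage_pools value as role-input YAML (reconstructed here, not part of the captured run): the volume requests encryption: true but supplies neither encryption_password nor encryption_key, which is exactly what blivet rejects.

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            # encryption_password / encryption_key left unset ->
            # "encrypted volume 'test1' missing key/password"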
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Sunday 18 January 2026 01:51:53 -0500 (0:00:00.354) 0:07:14.640 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Sunday 18 January 2026 01:51:53 -0500 (0:00:00.145) 0:07:14.785 ********
ok: [managed-node9] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Sunday 18 January 2026 01:51:53 -0500 (0:00:00.196) 0:07:14.982 ********
ok: [managed-node9] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Sunday 18 January 2026 01:51:54 -0500 (0:00:00.414) 0:07:15.396 ********
skipping: [managed-node9] => {
    "changed": false,
    "false_condition": "__storage_failed_exception is defined",
    "skip_reason": "Conditional result was False"
}
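The test then repeats the request with a password supplied. A sketch of that corrected invocation, assuming the include_role form these tests typically use (the task file itself is not shown in this log); the pool and volume values match the storage_pools dump printed below.

    - name: Create an encrypted partition volume w/ default fs
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo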
"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:51:57 -0500 (0:00:00.849) 0:07:18.898 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:51:58 -0500 (0:00:00.441) 0:07:19.339 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:51:58 -0500 (0:00:00.275) 0:07:19.615 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:51:58 -0500 (0:00:00.183) 0:07:19.798 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:51:58 -0500 (0:00:00.278) 0:07:20.077 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 18 January 2026 01:51:59 -0500 (0:00:00.726) 0:07:20.804 ********
ok: [managed-node9] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Sunday 18 January 2026 01:52:01 -0500 (0:00:02.304) 0:07:23.108 ********
ok: [managed-node9] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": true,
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 18 January 2026 01:52:02 -0500 (0:00:00.636) 0:07:23.745 ********
ok: [managed-node9] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 18 January 2026 01:52:03 -0500 (0:00:01.283) 0:07:25.029 ********
ok: [managed-node9] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 18 January 2026 01:52:06 -0500 (0:00:03.156) 0:07:28.185 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 18 January 2026 01:52:08 -0500 (0:00:01.229) 0:07:29.414 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Sunday 18 January 2026 01:52:08 -0500 (0:00:00.687) 0:07:30.102 ********
skipping: [managed-node9] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Sunday 18 January 2026 01:52:09 -0500 (0:00:00.497) 0:07:30.599 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38
Sunday 18 January 2026
01:52:09 -0500 (0:00:00.481) 0:07:31.105 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:52:11 -0500 (0:00:02.075) 0:07:33.181 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:52:17 -0500 (0:00:05.175) 0:07:38.357 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:52:17 -0500 (0:00:00.701) 0:07:39.059 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:52:18 -0500 (0:00:00.162) 0:07:39.221 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "mounted" } ], "packages": [ 
"cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:52:31 -0500 (0:00:13.122) 0:07:52.344 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:52:31 -0500 (0:00:00.379) 0:07:52.724 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719031.7185187, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "524112b098421226cc4db19eeaf2d31dc9f6f1ce", "ctime": 1768719031.7145188, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719031.7145188, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:52:31 -0500 (0:00:13.122) 0:07:52.344 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:52:31 -0500 (0:00:00.379) 0:07:52.724 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719031.7185187, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "524112b098421226cc4db19eeaf2d31dc9f6f1ce", "ctime": 1768719031.7145188, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719031.7145188, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:52:32 -0500 (0:00:01.282) 0:07:54.006 ******** ok: [managed-node9] => { "backup": "", "changed": false }
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:52:34 -0500 (0:00:01.392) 0:07:55.399 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:52:34 -0500 (0:00:00.125) 0:07:55.524 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:52:34 -0500 (0:00:00.342) 0:07:55.867 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 01:52:34 -0500 (0:00:00.342) 0:07:56.209 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 01:52:35 -0500 (0:00:00.260) 0:07:56.470 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: 
[managed-node9] => (item={'src': '/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 01:52:36 -0500 (0:00:01.538) 0:07:58.008 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 01:52:38 -0500 (0:00:01.356) 0:07:59.364 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926" }
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 01:52:39 -0500 (0:00:01.563) 0:08:00.928 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false }
MSG: All items skipped
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:52:40 -0500 (0:00:00.681) 0:08:01.609 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} }
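
Both "Tell systemd to refresh its view of /etc/fstab" tasks return "name": null with an empty status, which is the shape a bare daemon-reload produces. systemd generates .mount units from /etc/fstab, so the reload is what makes the edited /opt/test1 entry visible to systemd on either side of the mount changes. A minimal sketch of that pattern, assuming the stock ansible.builtin.systemd module rather than quoting the role's actual task file:

    - name: Tell systemd to refresh its view of /etc/fstab
      ansible.builtin.systemd:
        daemon_reload: true
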
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:52:42 -0500 (0:00:02.044) 0:08:03.654 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719044.8705802, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "aef3ce9889a4ea812a1ec156a1b859f16411cc3c", "ctime": 1768719037.2115445, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 557842649, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768719037.212688, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1685110160", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:52:43 -0500 (0:00:01.145) 0:08:04.799 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda', 'name': 'luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "present" } }
MSG: line added
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:52:46 -0500 (0:00:02.860) 0:08:07.660 ******** ok: [managed-node9]
TASK [Verify role results - 4] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:233 Sunday 18 January 2026 01:52:48 -0500 (0:00:01.796) 0:08:09.457 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9
TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:52:49 -0500 (0:00:00.875) 0:08:10.334 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null,
"raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:52:49 -0500 (0:00:00.570) 0:08:10.904 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 01:52:50 -0500 (0:00:00.487) 0:08:11.392 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "size": "4G", "type": "crypt", "uuid": "3584d2c7-a6f2-44d8-823c-c558f9e81745" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "011d559f-bdd6-4d26-94dc-eebd371e9926" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { 
"fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 01:52:51 -0500 (0:00:01.305) 0:08:12.698 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002952", "end": "2026-01-18 01:52:52.552971", "rc": 0, "start": "2026-01-18 01:52:52.550019" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 01:52:52 -0500 (0:00:01.220) 0:08:13.918 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003077", "end": "2026-01-18 01:52:53.683215", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:52:53.680138" } STDOUT: luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 01:52:53 -0500 (0:00:01.136) 0:08:15.055 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 01:52:53 -0500 (0:00:01.136) 0:08:15.055 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]})
TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 01:52:54 -0500 (0:00:00.869) 0:08:15.925 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 01:52:54 -0500 (0:00:00.193) 0:08:16.119 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 01:52:55 -0500 (0:00:00.254) 0:08:16.373 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 01:52:55 -0500 (0:00:00.188) 0:08:16.561 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes)
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 01:52:55 -0500 (0:00:00.595) 0:08:17.157 ******** skipping: [managed-node9] => { "changed":
false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 01:52:56 -0500 (0:00:00.201) 0:08:17.358 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 01:52:56 -0500 (0:00:00.250) 0:08:17.608 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 01:52:56 -0500 (0:00:00.186) 0:08:17.794 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 01:52:56 -0500 (0:00:00.190) 0:08:17.985 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 01:52:56 -0500 (0:00:00.178) 0:08:18.164 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 01:52:57 -0500 (0:00:00.200) 0:08:18.364 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 01:52:57 -0500 (0:00:00.192) 0:08:18.557 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 01:52:57 -0500 (0:00:00.200) 0:08:18.757 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": 
"Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 01:52:57 -0500 (0:00:00.199) 0:08:18.957 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:92199): WARNING **: 01:52:58.782: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 01:52:59 -0500 (0:00:01.293) 0:08:20.250 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 01:52:59 -0500 (0:00:00.426) 0:08:20.677 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 01:53:00 -0500 (0:00:00.722) 0:08:21.399 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 01:53:00 -0500 (0:00:00.156) 0:08:21.556 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 01:53:00 -0500 (0:00:00.201) 0:08:21.757 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 01:53:00 -0500 (0:00:00.257) 0:08:22.015 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 01:53:00 -0500 (0:00:00.181) 0:08:22.197 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 01:53:01 -0500 (0:00:00.223) 0:08:22.421 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 01:53:01 -0500 (0:00:00.240) 0:08:22.661 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 01:53:01 -0500 (0:00:00.180) 0:08:22.842 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 01:53:01 -0500 (0:00:00.226) 0:08:23.068 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 01:53:02 -0500 (0:00:00.187) 0:08:23.256 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 01:53:02 -0500 (0:00:00.208) 
0:08:23.464 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 01:53:02 -0500 (0:00:00.231) 0:08:23.695 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 01:53:03 -0500 (0:00:00.696) 0:08:24.392 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": 
"partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 01:53:03 -0500 (0:00:00.313) 0:08:24.706 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 01:53:03 -0500 (0:00:00.444) 0:08:25.151 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check member encryption] 
************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 01:53:04 -0500 (0:00:00.271) 0:08:25.423 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 01:53:04 -0500 (0:00:00.578) 0:08:26.001 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 01:53:05 -0500 (0:00:00.503) 0:08:26.505 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 01:53:05 -0500 (0:00:00.183) 0:08:26.689 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 18 January 2026 01:53:05 -0500 (0:00:00.200) 0:08:26.889 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Sunday 18 January 2026 01:53:05 -0500 (0:00:00.177) 0:08:27.067 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 18 January 2026 01:53:06 -0500 (0:00:00.795) 0:08:27.863 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Sunday 18 January 2026 01:53:07 -0500 (0:00:00.366) 0:08:28.229 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 18 January 2026 01:53:07 -0500 (0:00:00.605) 0:08:28.835 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 18 January 2026 01:53:07 -0500 (0:00:00.177) 0:08:29.013 ******** skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 18 January 2026 01:53:07 -0500 (0:00:00.169) 0:08:29.183 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 18 January 2026 01:53:08 -0500 (0:00:00.149) 0:08:29.333 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 18 January 2026 01:53:08 -0500 (0:00:00.165) 0:08:29.498 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 18 January 2026 01:53:08 -0500 (0:00:00.271) 0:08:29.770 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 18 January 2026 01:53:08 -0500 (0:00:00.171) 0:08:29.942 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 01:53:08 -0500 (0:00:00.205) 0:08:30.147 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 01:53:09 -0500 (0:00:00.185) 0:08:30.333 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 
'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 01:53:09 -0500 (0:00:00.336) 0:08:30.669 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 01:53:09 -0500 (0:00:00.520) 0:08:31.189 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 01:53:11 -0500 (0:00:01.953) 0:08:33.143 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 01:53:12 -0500 (0:00:00.351) 0:08:33.495 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
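The mount checks that follow match the decrypted mapper device against the gathered mount facts: the volume must show up in ansible_facts.mounts under the expected device path and mount point, and no swap entry may reference it. A minimal stand-alone sketch of that idea (a hypothetical task using the two facts set above; not the role's actual assertion):

    - name: Verify the current mount state by device (sketch)
      ansible.builtin.assert:
        that:
          # exactly one mount of the mapper device at the expected mount point
          - ansible_facts.mounts | selectattr('device', 'equalto', storage_test_device_path) | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point) | list | length == 1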
TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 01:53:12 -0500 (0:00:00.607) 0:08:34.102 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none
and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:53:13 -0500 (0:00:00.519) 0:08:34.622 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:53:13 -0500 (0:00:00.221) 0:08:34.843 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:53:14 -0500 (0:00:00.672) 0:08:35.515 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:53:15 -0500 (0:00:01.464) 0:08:36.980 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:53:17 -0500 (0:00:01.336) 0:08:38.316 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:53:18 -0500 (0:00:00.894) 0:08:39.211 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:53:18 -0500 (0:00:00.177) 0:08:39.389 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:53:18 -0500 (0:00:00.202) 0:08:39.592 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:53:18 -0500 (0:00:00.162) 0:08:39.754 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:53:19 -0500 (0:00:01.072) 0:08:40.827 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:53:20 -0500 (0:00:00.442) 0:08:41.269 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:53:20 -0500 (0:00:00.610) 0:08:41.879 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:53:21 -0500 (0:00:00.556) 0:08:42.436 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:53:21 -0500 (0:00:00.667) 0:08:43.104 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
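The fstab verification above is plain text matching: the role reads /etc/fstab and counts lines that begin with the volume's mount id, name the expected mount point, and carry the expected mount options (the *_matches lists in the facts, each expected to match exactly once). A rough equivalent, assuming a hypothetical __fstab_content variable already holding the text of /etc/fstab from the managed node (a sketch, not the role's implementation):

    - name: Verify that the device identifier appears in /etc/fstab exactly once (sketch)
      ansible.builtin.assert:
        that:
          # one fstab line must start with the mapper path followed by whitespace
          - __fstab_content | regex_findall('^' ~ (storage_test_device_path | regex_escape()) ~ '\s+', multiline=True) | length == 1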
TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:53:22 -0500 (0:00:00.298) 0:08:43.402 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************* task
path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:53:22 -0500 (0:00:00.806) 0:08:44.209 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:53:23 -0500 (0:00:00.778) 0:08:44.988 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719150.3960736, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719150.3960736, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1258, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768719150.3960736, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:53:25 -0500 (0:00:01.255) 0:08:46.243 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:53:25 -0500 (0:00:00.288) 0:08:46.532 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:53:25 -0500 (0:00:00.442) 0:08:46.747 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:53:25 -0500 (0:00:00.442) 0:08:47.190 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:53:26 -0500 (0:00:00.287) 0:08:47.478 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
-0500 (0:00:00.181) 0:08:47.660 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 01:53:26 -0500 (0:00:00.302) 0:08:47.962 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719150.8780758, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719150.8780758, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1327, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719150.8780758, "nlink": 1, "path": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 01:53:27 -0500 (0:00:01.153) 0:08:49.116 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 01:53:30 -0500 (0:00:02.141) 0:08:51.257 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007202", "end": "2026-01-18 01:53:31.021560", "rc": 0, "start": "2026-01-18 01:53:31.014358" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 011d559f-bdd6-4d26-94dc-eebd371e9926 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 668008 Threads: 2 Salt: 82 b2 a9 90 29 3b a7 13 88 b1 a8 a3 f2 68 8d a9 2d 5c 71 10 87 d5 e5 c6 71 18 63 24 4c 72 89 8d AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 133338 Salt: 9b ef 2f 70 8a 9b 84 ff 0c 8a db 48 82 26 f0 06 10 df 94 3c a6 5e b4 36 84 eb 81 6d d2 8a c3 88 Digest: cf 0c 50 f9 76 e1 fb 5a 9d 1a 02 9b de 22 7b c9 ce 7c 7b 75 9a fa be f3 93 f2 d6 3d 9e 53 dd 1a TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 01:53:31 -0500 (0:00:01.166) 0:08:52.424 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
TASK [Verify that the raw device is the same as the device if not encrypted] *** task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 01:53:31 -0500 (0:00:00.571) 0:08:52.996 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 01:53:32 -0500 (0:00:00.821) 0:08:53.817 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 01:53:33 -0500 (0:00:00.416) 0:08:54.233 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 01:53:33 -0500 (0:00:00.266) 0:08:54.500 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 01:53:33 -0500 (0:00:00.267) 0:08:54.767 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 01:53:33 -0500 (0:00:00.311) 0:08:55.078 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 01:53:34 -0500 (0:00:00.298) 0:08:55.377 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 01:53:34 -0500 (0:00:00.692) 0:08:56.069 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 01:53:35 -0500 (0:00:00.695) 0:08:56.765 ******** ok: [managed-node9] => { "changed": false
} MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 01:53:36 -0500 (0:00:00.627) 0:08:57.392 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:53:36 -0500 (0:00:00.702) 0:08:58.095 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:53:37 -0500 (0:00:00.757) 0:08:58.853 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
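Each /etc/crypttab line has the form "<name> <backing device> <key file> [options]", and the three assertions above check those fields one at a time against the entry recorded earlier in _storage_test_crypttab_entries ("luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 -", where "-" means no key file because the passphrase was supplied directly). Condensed into a single hypothetical task (a sketch against the values from this run, not the role's code):

    - name: Validate the crypttab entry fields (sketch)
      vars:
        __entry: "{{ _storage_test_crypttab_entries[0].split() }}"
      ansible.builtin.assert:
        that:
          - __entry[0] == 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926'  # mapper name
          - __entry[1] == '/dev/sda1'  # raw device backing the LUKS layer
          - __entry[2] == '-'          # '-' = no key file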
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:53:37 -0500 (0:00:00.247) 0:08:59.100 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:53:38 -0500 (0:00:00.185) 0:08:59.285 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:53:38 -0500 (0:00:00.270) 0:08:59.556 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:53:38 -0500 (0:00:00.228) 0:08:59.785 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:53:38 -0500 (0:00:00.227) 0:09:00.013 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:53:39 -0500 (0:00:00.249) 0:09:00.263 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:53:39 -0500 (0:00:00.218) 0:09:00.481 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:53:39 -0500 (0:00:00.221) 0:09:00.703 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:53:39 -0500 (0:00:00.274) 0:09:00.978 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:53:40 -0500 (0:00:00.271) 0:09:01.250 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:53:40 -0500 (0:00:00.218) 0:09:01.468 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:53:40 -0500 (0:00:00.475) 0:09:01.943 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:53:41 -0500 (0:00:00.538) 0:09:02.482 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:53:41 -0500 (0:00:00.626) 0:09:03.108 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:53:42 -0500 (0:00:00.368) 0:09:03.477 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:53:42 -0500 (0:00:00.538) 0:09:04.016 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:53:43 -0500 (0:00:00.565) 0:09:04.581 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:53:43 -0500 (0:00:00.463) 0:09:05.045 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:53:44 -0500 (0:00:00.379) 0:09:05.425 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:53:44 -0500 (0:00:00.300) 0:09:05.726 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:53:44 -0500 (0:00:00.257) 0:09:05.984 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:53:45 -0500 (0:00:00.607) 0:09:06.591 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was 
False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:53:45 -0500 (0:00:00.383) 0:09:06.975 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:53:46 -0500 (0:00:00.620) 0:09:07.596 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:53:47 -0500 (0:00:00.627) 0:09:08.224 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:53:47 -0500 (0:00:00.696) 0:09:08.920 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:53:48 -0500 (0:00:00.614) 0:09:09.534 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:53:48 -0500 (0:00:00.658) 0:09:10.193 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:53:49 -0500 (0:00:00.578) 0:09:10.772 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:53:50 -0500 (0:00:00.758) 0:09:11.530 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:53:50 -0500 (0:00:00.532) 0:09:12.062 ******** skipping: [managed-node9] => { 
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 01:53:51 -0500 (0:00:00.590) 0:09:12.652 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:53:51 -0500 (0:00:00.414) 0:09:13.067 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:53:52 -0500 (0:00:00.639) 0:09:13.707 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:53:53 -0500 (0:00:00.683) 0:09:14.390 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:53:53 -0500 (0:00:00.306) 0:09:14.697 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:53:53 -0500 (0:00:00.294) 0:09:14.992 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:53:54 -0500 (0:00:00.594) 0:09:15.586 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:53:54 -0500 (0:00:00.274) 0:09:15.860 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 01:53:54 -0500 (0:00:00.270) 0:09:16.131 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 01:53:55 -0500 (0:00:00.208) 0:09:16.355 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 01:53:55 -0500 (0:00:00.223) 0:09:16.578 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 01:53:55 -0500 (0:00:00.182) 0:09:16.760 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 01:53:55 -0500 (0:00:00.205) 0:09:17.000 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 01:53:56 -0500 (0:00:00.209) 0:09:17.210 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 01:53:56 -0500 (0:00:00.126) 0:09:17.336 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 01:53:56 -0500 (0:00:00.470) 0:09:17.807 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": 
null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 01:53:56 -0500 (0:00:00.183) 0:09:17.990 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:239 Sunday 18 January 2026 01:53:57 -0500 (0:00:01.162) 0:09:19.153 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 01:53:58 -0500 (0:00:00.964) 0:09:20.117 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 01:53:59 -0500 (0:00:00.645) 0:09:20.762 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:54:00 -0500 (0:00:00.480) 0:09:21.243 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:54:00 -0500 (0:00:00.419) 0:09:21.662 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:54:01 -0500 (0:00:00.704) 0:09:22.367 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 
'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:54:02 -0500 (0:00:00.890) 0:09:23.257 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:54:02 -0500 (0:00:00.304) 0:09:23.562 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:54:02 -0500 (0:00:00.288) 0:09:23.850 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:54:02 -0500 (0:00:00.327) 0:09:24.178 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:54:03 -0500 (0:00:00.182) 0:09:24.361 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:54:03 -0500 (0:00:00.710) 0:09:25.072 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs
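This pass through the role is expected to fail: storage_safe_mode is true (the global copy stored a few tasks back), and the requested pool, displayed next, turns encryption back off on a partition that currently holds the LUKS container and the test file. Removing encryption would destroy that data, so safe mode must abort instead of proceeding. A sketch of the invocation under test, reconstructed from the storage_pools value shown below (not the literal test code):

    - name: Ask the role to remove encryption while safe mode is on (should fail)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true
        storage_pools:
          - name: foo
            disks: [sda]
            type: partition
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false
                encryption_password: yabbadabbadoo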
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:54:06 -0500 (0:00:02.389) 0:09:27.461 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:54:06 -0500 (0:00:00.680) 0:09:28.141 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:54:07 -0500 (0:00:00.752) 0:09:28.893 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:54:10 -0500 (0:00:02.773) 0:09:31.666 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:54:10 -0500 (0:00:00.440) 0:09:32.107 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:54:11 -0500 (0:00:00.532) 0:09:32.639 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:54:12 -0500 (0:00:00.608) 0:09:33.248 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:54:12 -0500 (0:00:00.576) 0:09:33.825 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:54:14 -0500 (0:00:02.299) 0:09:36.124 ******** ok: [managed-node9] => {
"ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", 
"status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service": { "name": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { 
"name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", 
"status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:54:18 -0500 (0:00:03.267) 0:09:39.391 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:54:19 -0500 (0:00:00.961) 0:09:40.352 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d4eb8a886\x2d6a8c\x2d44cf\x2db518\x2d9f5a2406116d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "name": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket dev-sda.device \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d4eb8a886\\\\x2d6a8c\\\\x2d44cf\\\\x2db518\\\\x2d9f5a2406116d.target\" cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", 
"ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4eb8a886-6a8c-44cf-b518-9f5a2406116d ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", 
"LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4eb8a886\\\\x2d6a8c\\\\x2d44cf\\\\x2db518\\\\x2d9f5a2406116d.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 01:52:42 EST", "StateChangeTimestampMonotonic": "10740607175", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": 
"\"blockdev@dev-mapper-luks\\\\x2d4eb8a886\\\\x2d6a8c\\\\x2d44cf\\\\x2db518\\\\x2d9f5a2406116d.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:54:20 -0500 (0:00:01.716) 0:09:42.069 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 01:54:23 -0500 (0:00:02.724) 0:09:44.793 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': 
TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Sunday 18 January 2026 01:54:23 -0500 (0:00:02.724) 0:09:44.793 ********
fatal: [managed-node9]: FAILED! => { "changed": false }
MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Sunday 18 January 2026 01:54:23 -0500 (0:00:00.386) 0:09:45.180 ********
changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d4eb8a886\x2d6a8c\x2d44cf\x2db518\x2d9f5a2406116d.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "name": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service",
"FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d4eb8a886\\x2d6a8c\\x2d44cf\\x2db518\\x2d9f5a2406116d.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d4eb8a886\\\\x2d6a8c\\\\x2d44cf\\\\x2db518\\\\x2d9f5a2406116d.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": 
"no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 18 January 2026 01:54:25 -0500 (0:00:01.552) 0:09:46.732 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 18 January 2026 01:54:25 -0500 (0:00:00.312) 0:09:47.045 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 18 January 2026 01:54:26 -0500 (0:00:00.455) 0:09:47.500 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 18 January 2026 01:54:26 -0500 (0:00:00.287) 0:09:47.788 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719237.765482, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768719237.765482, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768719237.765482, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1129162259", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 18 January 2026 01:54:27 -0500 (0:00:01.195) 0:09:48.988 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:263 Sunday 18 January 2026 01:54:28 -0500 (0:00:00.400) 0:09:49.389 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:54:29 -0500 (0:00:01.600) 0:09:50.990 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:54:31 -0500 (0:00:01.319) 0:09:52.310 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:54:31 -0500 (0:00:00.543) 0:09:52.854 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", 
"changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:54:32 -0500 (0:00:00.791) 0:09:53.645 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:54:32 -0500 (0:00:00.289) 0:09:53.935 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:54:33 -0500 (0:00:00.350) 0:09:54.285 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:54:33 -0500 (0:00:00.265) 0:09:54.551 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:54:33 -0500 (0:00:00.304) 0:09:54.855 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:54:34 -0500 (0:00:00.814) 0:09:55.670 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:54:36 -0500 (0:00:02.031) 0:09:57.701 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:54:37 -0500 (0:00:00.736) 0:09:58.437 ******** ok: [managed-node9] => 
{ "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:54:37 -0500 (0:00:00.544) 0:09:58.982 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:54:40 -0500 (0:00:02.700) 0:10:01.682 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:54:40 -0500 (0:00:00.464) 0:10:02.147 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:54:41 -0500 (0:00:00.506) 0:10:02.654 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:54:41 -0500 (0:00:00.510) 0:10:03.164 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:54:42 -0500 (0:00:00.578) 0:10:03.742 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:54:44 -0500 (0:00:02.435) 0:10:06.177 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { 
"name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service": { "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": 
"systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:54:51 -0500 (0:00:06.135) 0:10:12.313 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:54:51 -0500 (0:00:00.775) 0:10:13.089 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d011d559f\x2dbdd6\x2d4d26\x2d94dc\x2deebd371e9926.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach 
luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 01:54:25 EST", "StateChangeTimestampMonotonic": "10843692085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:54:53 -0500 (0:00:01.623) 0:10:14.713 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": null }, { "action": "destroy format", 
"device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:54:56 -0500 (0:00:03.320) 0:10:18.034 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:54:57 -0500 (0:00:00.581) 0:10:18.615 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719159.5421164, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "eda7ba62da54f1320a3bd974777156657d0af444", "ctime": 1768719159.5381165, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 
574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719159.5381165, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:54:58 -0500 (0:00:01.385) 0:10:20.001 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:55:00 -0500 (0:00:01.389) 0:10:21.409 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d011d559f\x2dbdd6\x2d4d26\x2d94dc\x2deebd371e9926.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": 
"/etc/systemd/system/systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", 
"RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 01:54:25 EST", "StateChangeTimestampMonotonic": "10843692085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:55:01 -0500 (0:00:01.769) 0:10:23.179 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:55:02 -0500 (0:00:00.411) 0:10:23.590 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130
Sunday 18 January 2026 01:55:02 -0500 (0:00:00.411) 0:10:23.590 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134
Sunday 18 January 2026 01:55:02 -0500 (0:00:00.352) 0:10:23.963 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Sunday 18 January 2026 01:55:03 -0500 (0:00:00.308) 0:10:24.272 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node9] => (item={'src': '/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-011d559f-bdd6-4d26-94dc-eebd371e9926" }
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Sunday 18 January 2026 01:55:05 -0500 (0:00:01.974) 0:10:26.246 ********
ok: [managed-node9] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Sunday 18 January 2026 01:55:07 -0500 (0:00:02.123) 0:10:28.370 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node9] => (item={'src': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671" }
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Sunday 18 January 2026 01:55:09 -0500 (0:00:01.886) 0:10:30.257 ********
skipping: [managed-node9] => (item={'src': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1",
"src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:55:09 -0500 (0:00:00.861) 0:10:31.118 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:55:11 -0500 (0:00:01.903) 0:10:33.021 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719173.6821826, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ab0ead4a7525b62c8f6ec041dcd61dcc88314777", "ctime": 1768719166.2241476, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 88080585, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768719166.2251828, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3343465345", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:55:13 -0500 (0:00:01.337) 0:10:34.358 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:55:14 -0500 (0:00:01.758) 0:10:36.117 ******** ok: [managed-node9] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Sunday 18 January 2026 01:55:16 -0500 (0:00:02.032) 0:10:38.150 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:55:17 -0500 (0:00:01.014) 0:10:39.164 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:55:18 -0500 (0:00:00.737) 0:10:39.902 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Sunday 18 January 2026 01:55:19 -0500 (0:00:00.590) 0:10:40.493 ********
ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "156fa78f-9b9a-4ca9-8a54-9fb077a20671" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } }
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Sunday 18 January 2026 01:55:20 -0500 (0:00:01.210) 0:10:41.704 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002969", "end": "2026-01-18 01:55:21.524791", "rc": 0, "start": "2026-01-18 01:55:21.521822" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Mon Jan 5 14:46:37 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Sunday 18 January 2026 01:55:21 -0500 (0:00:01.193) 0:10:42.897 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003087", "end": "2026-01-18 01:55:22.741551", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:55:22.738464" }
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Sunday 18 January 2026 01:55:22 -0500 (0:00:01.244) 0:10:44.142 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id':
'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 01:55:23 -0500 (0:00:00.612) 0:10:44.755 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 01:55:23 -0500 (0:00:00.251) 0:10:45.007 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 01:55:24 -0500 (0:00:00.216) 0:10:45.223 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 01:55:24 -0500 (0:00:00.285) 0:10:45.509 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 01:55:24 -0500 (0:00:00.565) 0:10:46.074 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 01:55:25 -0500 (0:00:00.247) 0:10:46.322 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 01:55:25 -0500 (0:00:00.176) 0:10:46.498 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 01:55:25 -0500 (0:00:00.151) 0:10:46.649 ******** skipping: [managed-node9] => { "changed": 
false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 01:55:25 -0500 (0:00:00.230) 0:10:46.879 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 01:55:25 -0500 (0:00:00.278) 0:10:47.158 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 01:55:26 -0500 (0:00:00.320) 0:10:47.479 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 01:55:26 -0500 (0:00:00.293) 0:10:47.772 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 01:55:26 -0500 (0:00:00.281) 0:10:48.054 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 01:55:27 -0500 (0:00:00.195) 0:10:48.250 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:97823): WARNING **: 01:55:28.121: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 
10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 01:55:28 -0500 (0:00:01.303) 0:10:49.553 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 01:55:28 -0500 (0:00:00.496) 0:10:50.050 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 01:55:29 -0500 (0:00:00.715) 0:10:50.765 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 01:55:29 -0500 (0:00:00.256) 0:10:51.022 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 01:55:30 -0500 (0:00:00.225) 0:10:51.247 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 01:55:30 -0500 (0:00:00.184) 0:10:51.432 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 01:55:30 -0500 (0:00:00.179) 0:10:51.612 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 01:55:30 -0500 
(0:00:00.165) 0:10:51.778 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 01:55:30 -0500 (0:00:00.210) 0:10:51.988 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 01:55:30 -0500 (0:00:00.191) 0:10:52.180 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 01:55:31 -0500 (0:00:00.167) 0:10:52.347 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 01:55:31 -0500 (0:00:00.202) 0:10:52.550 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 01:55:31 -0500 (0:00:00.218) 0:10:52.768 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 01:55:31 -0500 (0:00:00.259) 0:10:53.027 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 01:55:32 -0500 (0:00:00.485) 0:10:53.512 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 
'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 01:55:32 -0500 (0:00:00.260) 0:10:53.773 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 01:55:33 -0500 (0:00:00.439) 0:10:54.212 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 
'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 01:55:33 -0500 (0:00:00.257) 0:10:54.469 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 01:55:33 -0500 (0:00:00.554) 0:10:55.024 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 01:55:34 -0500 (0:00:00.708) 0:10:55.733 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 01:55:34 -0500 (0:00:00.198) 0:10:55.931 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 18 January 2026 01:55:34 -0500 (0:00:00.169) 0:10:56.100 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Sunday 18 January 2026 01:55:35 -0500 (0:00:00.304) 0:10:56.405 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 18 January 2026 01:55:35 -0500 (0:00:00.672) 0:10:57.077 ******** skipping: [managed-node9] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, 
"thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Sunday 18 January 2026 01:55:36 -0500 (0:00:00.281) 0:10:57.359 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 18 January 2026 01:55:36 -0500 (0:00:00.740) 0:10:58.100 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 18 January 2026 01:55:37 -0500 (0:00:00.181) 0:10:58.281 ******** skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 18 January 2026 01:55:37 -0500 (0:00:00.219) 0:10:58.501 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 18 January 2026 01:55:37 -0500 (0:00:00.228) 0:10:58.729 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 18 January 2026 01:55:37 -0500 (0:00:00.228) 0:10:58.957 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 18 January 2026 01:55:37 -0500 (0:00:00.167) 0:10:59.125 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 18 January 2026 01:55:38 -0500 (0:00:00.150) 0:10:59.276 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 01:55:38 -0500 (0:00:00.203) 0:10:59.495 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 01:55:38 -0500 (0:00:00.195) 0:10:59.691 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 01:55:38 -0500 (0:00:00.310) 0:11:00.001 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 01:55:39 -0500 (0:00:00.492) 0:11:00.493 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 01:55:41 -0500 (0:00:02.470) 0:11:02.964 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 01:55:42 -0500 (0:00:00.342) 0:11:03.306 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 01:55:42 -0500 (0:00:00.644) 0:11:03.950 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:55:43 -0500 (0:00:00.621) 0:11:04.572 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:55:43 -0500 (0:00:00.311) 0:11:04.883 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:55:44 -0500 (0:00:00.706) 0:11:05.590 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:55:45 -0500 (0:00:00.681) 0:11:06.271 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not 
storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:55:45 -0500 (0:00:00.646) 0:11:06.917 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:55:45 -0500 (0:00:00.257) 0:11:07.175 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:55:46 -0500 (0:00:00.222) 0:11:07.398 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:55:46 -0500 (0:00:00.187) 0:11:07.585 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:55:46 -0500 (0:00:00.256) 0:11:07.841 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:55:47 -0500 (0:00:00.945) 0:11:08.787 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:55:48 -0500 (0:00:00.634) 0:11:09.421 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:55:48 -0500 (0:00:00.723) 0:11:10.145 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:55:49 -0500 (0:00:00.527) 0:11:10.673 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:55:50 -0500 (0:00:00.602) 0:11:11.275 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:55:50 -0500 (0:00:00.367) 0:11:11.643 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:55:51 -0500 (0:00:00.639) 0:11:12.282 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:55:51 -0500 (0:00:00.690) 0:11:12.997 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719296.5297568, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719296.5297568, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1527, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768719296.5297568, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:55:52 -0500 (0:00:01.180) 0:11:14.178 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:55:53 -0500 (0:00:00.335) 0:11:14.514 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:55:53 -0500 (0:00:00.235) 0:11:14.749 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:55:53 -0500 (0:00:00.334) 0:11:15.084 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:55:54 -0500 (0:00:00.278) 0:11:15.362 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 01:55:54 -0500 (0:00:00.209) 0:11:15.572 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 01:55:54 -0500 (0:00:00.296) 0:11:15.869 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 01:55:54 -0500 (0:00:00.233) 0:11:16.102 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 01:55:57 -0500 (0:00:02.150) 0:11:18.253 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 01:55:57 -0500 (0:00:00.242) 0:11:18.495 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 01:55:57 -0500 (0:00:00.222) 0:11:18.717 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 01:55:58 -0500 (0:00:00.602) 0:11:19.320 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 01:55:58 -0500 (0:00:00.264) 0:11:19.584 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 01:55:58 -0500 (0:00:00.264) 0:11:19.848 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 01:55:58 -0500 (0:00:00.240) 0:11:20.089 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 01:55:59 -0500 (0:00:00.230) 0:11:20.319 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 01:55:59 -0500 (0:00:00.176) 0:11:20.496 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 01:55:59 -0500 (0:00:00.573) 0:11:21.069 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 01:56:00 -0500 (0:00:00.635) 0:11:21.705 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 01:56:01 -0500 (0:00:00.567) 0:11:22.272 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:56:01 -0500 (0:00:00.628) 0:11:22.900 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:56:02 -0500 (0:00:00.666) 0:11:23.567 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:56:02 -0500 (0:00:00.212) 0:11:23.779 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:56:02 -0500 (0:00:00.196) 0:11:23.976 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:56:02 -0500 (0:00:00.179) 0:11:24.156 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:56:03 -0500 (0:00:00.223) 0:11:24.380 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] 
**************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:56:03 -0500 (0:00:00.263) 0:11:24.643 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:56:03 -0500 (0:00:00.218) 0:11:24.862 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:56:03 -0500 (0:00:00.238) 0:11:25.100 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:56:04 -0500 (0:00:00.192) 0:11:25.292 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:56:04 -0500 (0:00:00.223) 0:11:25.516 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:56:04 -0500 (0:00:00.227) 0:11:25.743 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:56:04 -0500 (0:00:00.217) 0:11:25.960 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:56:05 -0500 (0:00:00.714) 0:11:26.675 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
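Note: the size-parsing tasks above are skipped because this volume is a partition; for LVM volumes the suite converts human-readable sizes to bytes before comparing actual against requested size. A sketch of that conversion step using the requested size "4g" from this test's volume spec; the 2% tolerance and the placeholder __actual_bytes value are illustrative assumptions, not the suite's actual logic:

    - name: Convert the requested size to bytes ("4g" is this test's volume size)
      ansible.builtin.set_fact:
        __requested_bytes: "{{ '4g' | human_to_bytes }}"

    - name: Check a measured size against the request (placeholder actual value)
      vars:
        __actual_bytes: 4294967296   # the real suite reads this from block device info
      ansible.builtin.assert:
        that:
          - >-
            ((__actual_bytes | int) - (__requested_bytes | int)) | abs
            <= ((__requested_bytes | int) * 0.02)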
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:56:06 -0500 (0:00:00.639) 0:11:27.314 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:56:06 -0500 (0:00:00.647) 0:11:27.962 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:56:07 -0500 (0:00:00.310) 0:11:28.273 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:56:07 -0500 (0:00:00.492) 0:11:28.765 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:56:08 -0500 (0:00:00.638) 0:11:29.404 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:56:08 -0500 (0:00:00.424) 0:11:29.828 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:56:09 -0500 (0:00:00.532) 0:11:30.361 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:56:09 -0500 (0:00:00.528) 0:11:30.889 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:56:10 -0500 (0:00:00.813) 0:11:31.703 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result 
was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:56:11 -0500 (0:00:00.638) 0:11:32.364 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:56:11 -0500 (0:00:00.685) 0:11:33.050 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:56:12 -0500 (0:00:00.678) 0:11:33.728 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:56:13 -0500 (0:00:00.647) 0:11:34.376 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:56:13 -0500 (0:00:00.484) 0:11:34.861 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:56:14 -0500 (0:00:00.490) 0:11:35.351 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:56:14 -0500 (0:00:00.451) 0:11:35.803 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:56:15 -0500 (0:00:00.581) 0:11:36.384 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:56:15 -0500 (0:00:00.471) 0:11:36.855 ******** skipping: [managed-node9] => { 
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:56:16 -0500 (0:00:00.444) 0:11:37.300 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 01:56:16 -0500 (0:00:00.526) 0:11:37.826 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:56:17 -0500 (0:00:00.733) 0:11:38.560 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:56:17 -0500 (0:00:00.639) 0:11:39.199 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:56:18 -0500 (0:00:00.548) 0:11:39.748 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:56:18 -0500 (0:00:00.370) 0:11:40.118 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:56:19 -0500 (0:00:00.203) 0:11:40.322 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:56:19 -0500 (0:00:00.482) 0:11:40.804 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and 
_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:56:19 -0500 (0:00:00.193) 0:11:40.998 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 01:56:20 -0500 (0:00:00.220) 0:11:41.219 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 01:56:20 -0500 (0:00:00.292) 0:11:41.512 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 01:56:20 -0500 (0:00:00.252) 0:11:41.765 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 01:56:20 -0500 (0:00:00.233) 0:11:41.998 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 01:56:21 -0500 (0:00:00.255) 0:11:42.254 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 01:56:21 -0500 (0:00:00.229) 0:11:42.483 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 01:56:21 -0500 (0:00:00.290) 0:11:42.774 ******** skipping: [managed-node9] => { "changed": 
false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 01:56:22 -0500 (0:00:00.479) 0:11:43.253 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 01:56:22 -0500 (0:00:00.337) 0:11:43.591 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:286 Sunday 18 January 2026 01:56:23 -0500 (0:00:01.008) 0:11:44.599 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 01:56:24 -0500 (0:00:01.108) 0:11:45.708 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 01:56:25 -0500 (0:00:00.669) 0:11:46.378 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:56:25 -0500 (0:00:00.526) 0:11:46.904 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:56:26 -0500 (0:00:00.438) 0:11:47.343 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:56:26 -0500 (0:00:00.556) 0:11:47.899 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": 
"item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:56:27 -0500 (0:00:00.723) 0:11:48.623 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:56:27 -0500 (0:00:00.320) 0:11:48.944 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:56:28 -0500 (0:00:00.284) 0:11:49.229 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:56:28 -0500 (0:00:00.200) 0:11:49.430 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:56:28 -0500 (0:00:00.169) 0:11:49.599 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:56:29 -0500 (0:00:00.784) 0:11:50.384 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:56:31 -0500 (0:00:02.114) 0:11:52.498 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:56:31 -0500 (0:00:00.558) 0:11:53.057 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:56:32 -0500 (0:00:00.539) 0:11:53.596 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:56:35 -0500 (0:00:02.642) 0:11:56.239 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:56:35 -0500 (0:00:00.469) 0:11:56.708 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:56:36 -0500 (0:00:00.610) 0:11:57.319 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:56:36 -0500 (0:00:00.535) 0:11:57.855 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 
01:56:37 -0500 (0:00:00.613) 0:11:58.469 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:56:39 -0500 (0:00:02.257) 0:12:00.726 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": 
"indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service": { "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:56:42 -0500 (0:00:03.050) 0:12:03.776 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:56:43 -0500 (0:00:00.892) 0:12:04.669 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d011d559f\x2dbdd6\x2d4d26\x2d94dc\x2deebd371e9926.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket cryptsetup-pre.target dev-sda1.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock 
cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-011d559f-bdd6-4d26-94dc-eebd371e9926", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-011d559f-bdd6-4d26-94dc-eebd371e9926 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", 
"LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 01:54:25 EST", "StateChangeTimestampMonotonic": "10843692085", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": 
"infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:56:45 -0500 (0:00:01.650) 0:12:06.319 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 01:56:47 -0500 (0:00:02.520) 0:12:08.840 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 
'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:56:48 -0500 (0:00:00.409) 0:12:09.249 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d011d559f\x2dbdd6\x2d4d26\x2d94dc\x2deebd371e9926.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "name": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": 
"0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d011d559f\\x2dbdd6\\x2d4d26\\x2d94dc\\x2deebd371e9926.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d011d559f\\\\x2dbdd6\\\\x2d4d26\\\\x2d94dc\\\\x2deebd371e9926.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", 
"ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 18 January 2026 01:56:49 -0500 (0:00:01.628) 0:12:10.878 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 18 January 2026 01:56:49 -0500 (0:00:00.320) 0:12:11.198 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 18 January 2026 01:56:50 -0500 (0:00:00.414) 0:12:11.612 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 18 January 2026 01:56:50 -0500 (0:00:00.325) 0:12:11.938 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719383.2721622, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768719383.2721622, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, 
"ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768719383.2721622, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2717766711", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 18 January 2026 01:56:51 -0500 (0:00:01.270) 0:12:13.208 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312 Sunday 18 January 2026 01:56:52 -0500 (0:00:00.349) 0:12:13.558 ******** ok: [managed-node9] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_test0ncq9mm1lukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319 Sunday 18 January 2026 01:56:56 -0500 (0:00:03.892) 0:12:17.450 ******** ok: [managed-node9] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_test0ncq9mm1lukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1768719416.5841947-153489-123009776464209/.source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:326 Sunday 18 January 2026 01:57:01 -0500 (0:00:05.032) 0:12:22.482 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:57:01 -0500 (0:00:00.361) 0:12:22.844 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:57:01 -0500 (0:00:00.328) 0:12:23.173 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:57:04 -0500 (0:00:02.357) 0:12:25.530 ******** skipping: [managed-node9] => (item=RedHat.yml) => { 
"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:57:05 -0500 (0:00:00.902) 0:12:26.433 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:57:05 -0500 (0:00:00.315) 0:12:26.749 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:57:05 -0500 (0:00:00.306) 0:12:27.055 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:57:06 -0500 (0:00:00.189) 0:12:27.245 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:57:06 -0500 (0:00:00.196) 0:12:27.441 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:57:06 -0500 (0:00:00.632) 0:12:28.074 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:57:09 -0500 (0:00:02.314) 0:12:30.388 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:57:09 -0500 (0:00:00.783) 0:12:31.172 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:57:10 -0500 (0:00:00.459) 0:12:31.631 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:57:12 -0500 (0:00:02.426) 0:12:34.058 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:57:13 -0500 (0:00:00.553) 0:12:34.611 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:57:13 -0500 (0:00:00.482) 0:12:35.094 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:57:14 -0500 (0:00:00.638) 0:12:35.732 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:57:15 -0500 (0:00:00.586) 0:12:36.319 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:57:17 -0500 (0:00:02.307) 0:12:38.627 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { 
"name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": 
"inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:57:23 -0500 (0:00:06.066) 0:12:44.693 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:57:24 -0500 (0:00:00.773) 0:12:45.467 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:57:24 -0500 (0:00:00.173) 0:12:45.640 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "/tmp/storage_test0ncq9mm1lukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": 
null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 01:57:36 -0500 (0:00:12.270) 0:12:57.910 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 01:57:37 -0500 (0:00:00.506) 0:12:58.416 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719308.8388143, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ef8104f455f5353f080e14b494bbb8fa510cd422", "ctime": 1768719308.8358142, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719308.8358142, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 01:57:38 -0500 (0:00:01.209) 0:12:59.625 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** 
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 01:57:39 -0500 (0:00:01.173) 0:13:00.799 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 01:57:39 -0500 (0:00:00.153) 0:13:00.952 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "/tmp/storage_test0ncq9mm1lukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : 
Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 01:57:40 -0500 (0:00:00.305) 0:13:01.258 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 01:57:40 -0500 (0:00:00.393) 0:13:01.651 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 01:57:40 -0500 (0:00:00.201) 0:13:01.853 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': 'UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=156fa78f-9b9a-4ca9-8a54-9fb077a20671" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 01:57:42 -0500 (0:00:01.564) 0:13:03.417 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 01:57:44 -0500 (0:00:02.084) 0:13:05.502 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 01:57:45 -0500 (0:00:01.522) 0:13:07.024 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 01:57:46 -0500 (0:00:00.778) 0:13:07.803 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 01:57:48 -0500 (0:00:01.994) 0:13:09.797 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719322.7398794, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768719314.6908417, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 364904648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": 
false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1768719314.6914968, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "250039508", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 01:57:49 -0500 (0:00:01.329) 0:13:11.126 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-77ccb785-87b2-452e-b620-a3b7571d55bc', 'password': '/tmp/storage_test0ncq9mm1lukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "/tmp/storage_test0ncq9mm1lukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 01:57:51 -0500 (0:00:01.572) 0:13:12.699 ******** ok: [managed-node9] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:343 Sunday 18 January 2026 01:57:53 -0500 (0:00:01.935) 0:13:14.634 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 01:57:53 -0500 (0:00:00.385) 0:13:15.019 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, 
"raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 01:57:54 -0500 (0:00:00.748) 0:13:15.768 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 01:57:54 -0500 (0:00:00.418) 0:13:16.186 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "size": "4G", "type": "crypt", "uuid": "1c757a8e-043e-461a-9108-4e0b268418f8" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "77ccb785-87b2-452e-b620-a3b7571d55bc" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 01:57:56 -0500 (0:00:01.095) 0:13:17.282 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003035", "end": "2026-01-18 01:57:57.108082", "rc": 0, "start": "2026-01-18 01:57:57.105047" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. 
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 01:57:57 -0500 (0:00:01.193) 0:13:18.475 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003197", "end": "2026-01-18 01:57:58.323106", "failed_when_result": false, "rc": 0, "start": "2026-01-18 01:57:58.319909" } STDOUT: luks-77ccb785-87b2-452e-b620-a3b7571d55bc /dev/sda1 /tmp/storage_test0ncq9mm1lukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 01:57:58 -0500 (0:00:01.241) 0:13:19.716 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test0ncq9mm1lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 
'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 01:57:59 -0500 (0:00:01.019) 0:13:20.736 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 01:57:59 -0500 (0:00:00.299) 0:13:21.035 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 01:58:00 -0500 (0:00:00.207) 0:13:21.242 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 01:58:00 -0500 (0:00:00.255) 0:13:21.498 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 01:58:00 -0500 (0:00:00.463) 0:13:21.961 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 01:58:00 -0500 (0:00:00.159) 0:13:22.121 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 01:58:01 -0500 (0:00:00.152) 0:13:22.273 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was 
False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 01:58:01 -0500 (0:00:00.242) 0:13:22.516 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 01:58:01 -0500 (0:00:00.157) 0:13:22.673 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 01:58:01 -0500 (0:00:00.213) 0:13:22.887 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 01:58:01 -0500 (0:00:00.277) 0:13:23.164 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 01:58:02 -0500 (0:00:00.198) 0:13:23.363 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 01:58:02 -0500 (0:00:00.225) 0:13:23.589 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 01:58:02 -0500 (0:00:00.196) 0:13:23.786 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:103366): WARNING **: 01:58:03.497: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 01:58:03 -0500 (0:00:01.165) 0:13:24.951 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 01:58:04 -0500 (0:00:00.554) 0:13:25.506 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 01:58:04 -0500 (0:00:00.578) 0:13:26.084 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 01:58:05 -0500 (0:00:00.240) 0:13:26.324 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 01:58:05 -0500 (0:00:00.211) 0:13:26.536 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 01:58:05 -0500 (0:00:00.198) 0:13:26.735 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 01:58:05 -0500 (0:00:00.295) 0:13:27.030 ******** skipping: [managed-node9] => { "changed": false, 
"false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 01:58:06 -0500 (0:00:00.302) 0:13:27.333 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 01:58:06 -0500 (0:00:00.205) 0:13:27.539 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 01:58:06 -0500 (0:00:00.205) 0:13:27.744 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 01:58:06 -0500 (0:00:00.189) 0:13:27.933 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 01:58:07 -0500 (0:00:00.287) 0:13:28.221 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 01:58:07 -0500 (0:00:00.291) 0:13:28.512 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 01:58:07 -0500 (0:00:00.285) 0:13:28.798 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 01:58:08 -0500 (0:00:00.694) 0:13:29.492 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': 
'/tmp/storage_test0ncq9mm1lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 01:58:08 -0500 (0:00:00.293) 0:13:29.786 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 01:58:09 -0500 (0:00:00.455) 0:13:30.241 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test0ncq9mm1lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 
'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 01:58:09 -0500 (0:00:00.228) 0:13:30.469 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 01:58:10 -0500 (0:00:00.771) 0:13:31.241 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 
01:58:10 -0500 (0:00:00.602) 0:13:31.844 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 01:58:10 -0500 (0:00:00.135) 0:13:31.979 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 18 January 2026 01:58:10 -0500 (0:00:00.162) 0:13:32.141 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Sunday 18 January 2026 01:58:11 -0500 (0:00:00.224) 0:13:32.366 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 18 January 2026 01:58:11 -0500 (0:00:00.749) 0:13:33.115 ******** skipping: [managed-node9] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test0ncq9mm1lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test0ncq9mm1lukskey", 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Sunday 18 January 2026 01:58:12 -0500 (0:00:00.358) 0:13:33.474 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 18 January 2026 01:58:12 -0500 (0:00:00.668) 0:13:34.142 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 18 January 2026 01:58:13 -0500 (0:00:00.258) 0:13:34.401 ******** skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 18 January 2026 01:58:13 -0500 (0:00:00.191) 0:13:34.593 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 18 January 2026 01:58:13 -0500 (0:00:00.226) 0:13:34.819 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 18 January 2026 01:58:13 -0500 (0:00:00.196) 0:13:35.015 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 18 January 2026 01:58:14 -0500 (0:00:00.205) 0:13:35.221 
******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 18 January 2026 01:58:14 -0500 (0:00:00.205) 0:13:35.426 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 01:58:14 -0500 (0:00:00.392) 0:13:35.818 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 01:58:14 -0500 (0:00:00.224) 0:13:36.043 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test0ncq9mm1lukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 01:58:15 -0500 (0:00:00.499) 0:13:36.542 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 01:58:15 -0500 (0:00:00.524) 0:13:37.067 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => 
(item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 01:58:17 -0500 (0:00:02.082) 0:13:39.149 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 01:58:18 -0500 (0:00:00.393) 0:13:39.543 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 01:58:19 -0500 (0:00:00.671) 0:13:40.214 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 01:58:19 -0500 (0:00:00.709) 0:13:40.924 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 01:58:20 -0500 (0:00:00.429) 0:13:41.354 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 01:58:20 -0500 (0:00:00.722) 0:13:42.076 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 01:58:21 -0500 (0:00:00.628) 0:13:42.704 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 01:58:22 -0500 (0:00:00.651) 0:13:43.356 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 01:58:22 -0500 (0:00:00.250) 0:13:43.606 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 01:58:22 -0500 (0:00:00.306) 0:13:43.912 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 01:58:23 -0500 (0:00:01.265) 0:13:45.178 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 01:58:24 -0500 (0:00:00.285) 0:13:45.463 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
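The storage_test_fstab_*_matches facts above are pattern matches pulled out of /etc/fstab, and each expected count is "1": the LUKS mapper device must appear exactly once as the identifier, once next to the mount point, and once with the mount options. Stitching the matched fragments together with the volume's mount_check and mount_passno values of 0, the fstab entry being verified should look roughly like the following (a reconstruction from the matches above, not a copy of the file; column spacing in the real file may differ):

/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc /opt/test1 xfs defaults 0 0

TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: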
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 01:58:25 -0500 (0:00:01.135) 0:13:46.599 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 01:58:26 -0500 (0:00:00.679) 0:13:47.278 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 01:58:26 -0500 (0:00:00.636) 0:13:47.915 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 01:58:27 -0500 (0:00:00.563) 0:13:48.478 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 01:58:27 -0500 (0:00:00.658) 0:13:49.137 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 01:58:28 -0500 (0:00:00.269) 0:13:49.407 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 01:58:28 -0500 (0:00:00.723) 0:13:50.130 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 01:58:29 -0500 (0:00:00.703) 0:13:50.834 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719455.937502, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719455.937502, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1663, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768719455.937502, "nlink": 1, 
"path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 01:58:30 -0500 (0:00:01.111) 0:13:51.945 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 01:58:31 -0500 (0:00:00.345) 0:13:52.290 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 01:58:31 -0500 (0:00:00.308) 0:13:52.599 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 01:58:31 -0500 (0:00:00.400) 0:13:53.000 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 01:58:32 -0500 (0:00:00.257) 0:13:53.257 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 01:58:32 -0500 (0:00:00.233) 0:13:53.491 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 01:58:32 -0500 (0:00:00.287) 0:13:53.778 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719456.4185042, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719456.4185042, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1703, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719456.4185042, "nlink": 1, "path": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 01:58:33 -0500 (0:00:01.189) 0:13:54.967 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 01:58:35 -0500 (0:00:02.151) 0:13:57.118 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.008109", "end": "2026-01-18 01:58:37.132160", "rc": 0, "start": "2026-01-18 01:58:37.124051" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 77ccb785-87b2-452e-b620-a3b7571d55bc Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 668728 Threads: 2 Salt: 9f 16 67 fc ba 50 89 0f 72 3f 43 79 da 3a 23 29 e6 66 ac b1 d0 2f af 53 c1 3d 70 94 ab 8f 8b 32 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 133474 Salt: 14 5c 29 b7 88 ce 71 e1 1a aa f4 17 12 31 0d 38 27 5a da c1 d5 8f 32 11 cf 7c 82 74 9f 3c 2e 40 Digest: d4 65 31 50 e8 e7 e5 8d 5e 7b a1 21 70 a7 db 98 8c 2b 88 f3 0b cb 84 e2 f8 9d 32 3c ba b6 46 b0 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 01:58:37 -0500 (0:00:01.403) 0:13:58.522 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 01:58:38 -0500 (0:00:00.759) 0:13:59.282 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 01:58:38 -0500 (0:00:00.819) 0:14:00.102 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 01:58:39 -0500 (0:00:00.347) 0:14:00.449 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 01:58:39 -0500 (0:00:00.310) 0:14:00.760 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 01:58:40 -0500 (0:00:00.523) 0:14:01.284 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 01:58:40 -0500 (0:00:00.361) 0:14:01.646 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 01:58:40 -0500 (0:00:00.324) 0:14:01.971 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-77ccb785-87b2-452e-b620-a3b7571d55bc /dev/sda1 /tmp/storage_test0ncq9mm1lukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_test0ncq9mm1lukskey" }, "changed": false }

TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 01:58:41 -0500 (0:00:00.754) 0:14:02.725 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed

TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 01:58:42 -0500 (0:00:00.565) 0:14:03.290 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed

TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 01:58:42 -0500 (0:00:00.621) 0:14:03.911 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed

TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 01:58:43 -0500 (0:00:00.584) 0:14:04.496 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
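The _storage_test_crypttab_entries fact holds the single line the role is expected to have written to /etc/crypttab, and the four assertions above validate its format and then each of its fields in turn. Written out as the file entry itself, that is:

luks-77ccb785-87b2-452e-b620-a3b7571d55bc /dev/sda1 /tmp/storage_test0ncq9mm1lukskey

i.e. the name of the device-mapper device, the raw partition backing it, and the key file used to unlock it at boot time.

TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 01:58:43 -0500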
(0:00:00.624) 0:14:05.121 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 01:58:44 -0500 (0:00:00.264) 0:14:05.385 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 01:58:44 -0500 (0:00:00.211) 0:14:05.597 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 01:58:44 -0500 (0:00:00.242) 0:14:05.839 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 01:58:44 -0500 (0:00:00.180) 0:14:06.020 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 01:58:45 -0500 (0:00:00.376) 0:14:06.396 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 01:58:45 -0500 (0:00:00.206) 0:14:06.602 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 01:58:45 -0500 (0:00:00.223) 0:14:06.826 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 01:58:45 -0500 (0:00:00.239) 0:14:07.066 ******** skipping: [managed-node9] => { 
"changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 01:58:46 -0500 (0:00:00.213) 0:14:07.279 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 01:58:46 -0500 (0:00:00.235) 0:14:07.515 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 01:58:46 -0500 (0:00:00.212) 0:14:07.728 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 01:58:47 -0500 (0:00:00.541) 0:14:08.270 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 01:58:47 -0500 (0:00:00.562) 0:14:08.832 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 01:58:48 -0500 (0:00:00.577) 0:14:09.410 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 01:58:48 -0500 (0:00:00.220) 0:14:09.630 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 01:58:48 -0500 (0:00:00.526) 0:14:10.156 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] 
***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 01:58:49 -0500 (0:00:00.709) 0:14:10.866 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 01:58:50 -0500 (0:00:00.557) 0:14:11.423 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 01:58:50 -0500 (0:00:00.547) 0:14:11.971 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 01:58:51 -0500 (0:00:00.605) 0:14:12.577 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 01:58:52 -0500 (0:00:00.719) 0:14:13.296 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 01:58:52 -0500 (0:00:00.605) 0:14:13.901 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 01:58:53 -0500 (0:00:00.529) 0:14:14.431 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 01:58:53 -0500 (0:00:00.614) 0:14:15.046 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 01:58:54 -0500 (0:00:00.601) 0:14:15.647 
******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 01:58:55 -0500 (0:00:00.680) 0:14:16.328 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 01:58:55 -0500 (0:00:00.653) 0:14:16.981 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 01:58:56 -0500 (0:00:00.635) 0:14:17.617 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 01:58:57 -0500 (0:00:00.810) 0:14:18.428 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 01:58:57 -0500 (0:00:00.594) 0:14:19.023 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 01:58:58 -0500 (0:00:00.753) 0:14:19.776 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 01:58:59 -0500 (0:00:00.566) 0:14:20.342 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 01:58:59 -0500 (0:00:00.567) 0:14:20.910 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 01:59:00 -0500 (0:00:00.530) 0:14:21.440 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 01:59:00 -0500 (0:00:00.640) 0:14:22.081 ******** ok: [managed-node9] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 01:59:01 -0500 (0:00:00.276) 0:14:22.358 ******** ok: [managed-node9] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 01:59:01 -0500 (0:00:00.293) 0:14:22.652 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 01:59:01 -0500 (0:00:00.450) 0:14:23.102 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 01:59:02 -0500 (0:00:00.187) 0:14:23.289 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 01:59:02 -0500 (0:00:00.217) 0:14:23.510 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 01:59:02 -0500 (0:00:00.226) 0:14:23.737 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested 
cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 01:59:02 -0500 (0:00:00.205) 0:14:23.942 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 01:59:02 -0500 (0:00:00.221) 0:14:24.164 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 01:59:03 -0500 (0:00:00.209) 0:14:24.373 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 01:59:03 -0500 (0:00:00.217) 0:14:24.590 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 01:59:03 -0500 (0:00:00.225) 0:14:24.816 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 01:59:04 -0500 (0:00:00.502) 0:14:25.318 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:349 Sunday 18 January 2026 01:59:04 -0500 (0:00:00.221) 0:14:25.540 ******** ok: [managed-node9] => { "changed": false, "path": "/tmp/storage_test0ncq9mm1lukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:359 Sunday 18 January 2026 01:59:05 -0500 (0:00:01.151) 0:14:26.692 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 01:59:05 -0500 (0:00:00.409) 
0:14:27.101 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 01:59:06 -0500 (0:00:00.732) 0:14:27.833 ******** included: fedora.linux_system_roles.storage for managed-node9

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 01:59:07 -0500 (0:00:00.425) 0:14:28.259 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 01:59:07 -0500 (0:00:00.376) 0:14:28.636 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 01:59:08 -0500 (0:00:00.677) 0:14:29.314 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
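The platform/version lookup tries vars files from most generic to most specific (RedHat.yml, CentOS.yml, then version-qualified names), skipping the ones that do not exist; the same CentOS_9.yml result appears twice because both the major-version and full-version lookups resolve to the same file. Note the last entry of blivet_package_list: a single inline Jinja expression lets one vars file cover s390x and every other architecture. Reconstructed from the fact values above (the actual vars file may differ in ordering or comments), the relevant part of roles/storage/vars/CentOS_9.yml presumably reads:

blivet_package_list:
  - python3-blivet
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - vdo
  - kmod-kvdo
  - xfsprogs
  - stratisd
  - stratis-cli
  # pick the architecture-specific libblockdev variant at runtime
  - "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}"

TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:59:08 -0500 (0:00:00.776) 0:14:30.090 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not 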
__storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:59:09 -0500 (0:00:00.352) 0:14:30.442 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:59:09 -0500 (0:00:00.213) 0:14:30.655 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:59:09 -0500 (0:00:00.251) 0:14:30.907 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:59:10 -0500 (0:00:00.318) 0:14:31.225 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 01:59:10 -0500 (0:00:00.845) 0:14:32.071 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs

TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 01:59:13 -0500 (0:00:02.416) 0:14:34.487 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }
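This invocation is the negative test announced above ("Test for correct handling of new encrypted volume w/ no key - 3"): the pool spec requests an encrypted LVM volume but supplies neither encryption_password nor an encryption_key, and storage_safe_mode was stored as true, so the role is expected to fail with an error rather than create or reformat anything. As a playbook variable, the structure printed by "Show storage_pools" corresponds to the following sketch (reconstructed from the displayed values, not copied from the test source):

storage_safe_mode: true
storage_pools:
  - name: foo
    type: lvm
    disks:
      - sda
    volumes:
      - name: test1
        size: 4g
        mount_point: /opt/test1
        encryption: true
        # deliberately no encryption_password or encryption_key:
        # in safe mode the role must abort instead of formatting sda

TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 01:59:13 -0500 (0:00:00.714) 0:14:35.202 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 01:59:14 -0500 (0:00:00.713) 0:14:35.916 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": 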
[], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 01:59:17 -0500 (0:00:02.714) 0:14:38.631 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 01:59:17 -0500 (0:00:00.571) 0:14:39.202 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 01:59:18 -0500 (0:00:00.716) 0:14:39.919 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 01:59:19 -0500 (0:00:00.590) 0:14:40.510 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:59:19 -0500 (0:00:00.504) 0:14:41.014 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:59:21 -0500 (0:00:02.189) 0:14:43.204 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": 
"serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:59:25 -0500 (0:00:03.218) 0:14:46.423 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:59:26 -0500 (0:00:00.997) 0:14:47.420 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : 
Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Sunday 18 January 2026 01:59:26 -0500 (0:00:00.241) 0:14:47.662 ********
fatal: [managed-node9]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

encrypted volume 'test1' missing key/password

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111
Sunday 18 January 2026 01:59:29 -0500 (0:00:02.707) 0:14:50.370 ********
fatal: [managed-node9]: FAILED! => {
    "changed": false
}

MSG:

{'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Sunday 18 January 2026 01:59:29 -0500 (0:00:00.253) 0:14:50.824 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Check that we failed in the role] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23
Sunday 18 January 2026 01:59:29 -0500 (0:00:00.246) 0:14:51.078 ********
ok: [managed-node9] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify the blivet output and error message are correct] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28
Sunday 18 January 2026 01:59:30 -0500 (0:00:00.246) 0:14:51.324 ********
ok: [managed-node9] => {
    "changed": false
}

MSG:

All assertions passed

TASK [Verify correct exception or error message] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39
Sunday 18 January 2026 01:59:30 -0500 (0:00:00.440) 0:14:51.765 ********
skipping: [managed-node9] => {
    "changed": false,
    "false_condition": "__storage_failed_exception is defined",
    "skip_reason": "Conditional result was False"
}
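The failure above is the test's expected negative outcome: the volume sets "encryption": true, but the play supplies neither encryption_password nor encryption_key, so the role aborts before making any changes. For reference, a minimal sketch of a pool definition that would pass this check; the values mirror the Show storage_pools output of the retry that follows, and only one of encryption_password or encryption_key is needed to satisfy the missing-key check:

    # Sketch of role input that would pass the missing-key check.
    # All values are taken from the retry below; the passphrase is the
    # test's own throwaway secret, not a recommendation.
    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks1
            encryption_cipher: aes-xts-plain64
            encryption_key_size: 512
            encryption_password: yabbadabbadoo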
"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 01:59:33 -0500 (0:00:00.824) 0:14:54.253 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 01:59:33 -0500 (0:00:00.332) 0:14:54.586 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 01:59:33 -0500 (0:00:00.221) 0:14:54.808 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 01:59:33 -0500 (0:00:00.232) 0:14:55.040 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 01:59:34 -0500 (0:00:00.268) 0:14:55.309 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Sunday 18 January 2026 01:59:37 -0500 (0:00:00.732) 0:14:58.885 ********
ok: [managed-node9] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Sunday 18 January 2026 01:59:38 -0500 (0:00:00.627) 0:14:59.513 ********
ok: [managed-node9] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup",
        "lvm2"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Sunday 18 January 2026 01:59:41 -0500 (0:00:02.757) 0:15:02.270 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Sunday 18 January 2026 01:59:41 -0500 (0:00:00.547) 0:15:02.818 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Sunday 18 January 2026 01:59:42 -0500 (0:00:00.499) 0:15:03.317 ********
skipping: [managed-node9] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Sunday 18 January 2026 01:59:42 -0500 (0:00:00.596) 0:15:03.914 ********
skipping: [managed-node9] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 01:59:43 -0500 (0:00:00.530) 0:15:04.444 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 01:59:45 -0500 (0:00:02.282) 0:15:06.727 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": 
"dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { 
"name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": 
"systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": 
"stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 01:59:48 -0500 (0:00:03.235) 0:15:09.963 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 01:59:49 -0500 (0:00:00.799) 0:15:10.762 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 01:59:49 -0500 (0:00:00.215) 0:15:10.977 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", 
"/dev/xvda1", "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 02:00:01 -0500 (0:00:11.492) 0:15:22.470 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 02:00:01 -0500 (0:00:00.473) 0:15:22.944 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719465.6265473, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f4689a4ee1b52c88d334f2545623ffa5d5f9a9a5", "ctime": 1768719465.6235473, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719465.6235473, "nlink": 1, 
"path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 02:00:02 -0500 (0:00:01.145) 0:15:24.089 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:00:04 -0500 (0:00:01.228) 0:15:25.318 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 02:00:04 -0500 (0:00:00.184) 0:15:25.503 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 02:00:04 -0500 (0:00:00.425) 0:15:25.928 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage 
: Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 02:00:05 -0500 (0:00:00.419) 0:15:26.348 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 02:00:06 -0500 (0:00:01.366) 0:15:27.715 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-77ccb785-87b2-452e-b620-a3b7571d55bc" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 02:00:08 -0500 (0:00:01.886) 0:15:29.601 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 02:00:10 -0500 (0:00:01.933) 0:15:31.535 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 02:00:12 -0500 (0:00:01.822) 0:15:33.357 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 02:00:12 -0500 (0:00:00.589) 0:15:33.947 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 02:00:14 -0500 (0:00:01.912) 0:15:35.860 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719478.3216066, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "d2cd552fdca7a41a8b7fdf6453451ef901ca3762", "ctime": 1768719471.325574, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 629145802, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768719471.3265076, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3225175010", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 02:00:15 -0500 (0:00:01.128) 0:15:36.988 ******** changed: [managed-node9] => (item={'backing_device': '/dev/sda1', 'name': 'luks-77ccb785-87b2-452e-b620-a3b7571d55bc', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-624bac9e-643d-439d-b06b-2978f908f664', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 02:00:18 -0500 (0:00:02.577) 0:15:39.565 ******** ok: [managed-node9] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:396 Sunday 18 January 2026 02:00:20 -0500 (0:00:02.019) 0:15:41.585 ******** included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 02:00:20 -0500 (0:00:00.463) 0:15:42.049 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 02:00:21 -0500 (0:00:00.694) 0:15:42.743 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 02:00:22 -0500 (0:00:00.546) 0:15:43.290 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "624bac9e-643d-439d-b06b-2978f908f664" }, "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "size": "4G", "type": "crypt", "uuid": "24a74452-32f4-49ab-864a-37839714674e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 02:00:23 -0500 (0:00:01.240) 0:15:44.530 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002979", "end": "2026-01-18 02:00:24.409041", "rc": 0, "start": "2026-01-18 02:00:24.406062" } STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Mon Jan 5 14:46:37 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 02:00:24 -0500 (0:00:01.244) 0:15:45.774 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003136", "end": "2026-01-18 02:00:25.604639", "failed_when_result": false, "rc": 0, "start": "2026-01-18 02:00:25.601503" } STDOUT: luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 02:00:25 -0500 (0:00:01.261) 0:15:47.035 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version':
None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 02:00:26 -0500 (0:00:00.931) 0:15:47.967 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 02:00:26 -0500 (0:00:00.179) 0:15:48.147 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.033111", "end": "2026-01-18 02:00:27.991554", "rc": 0, "start": "2026-01-18 02:00:27.958443" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 02:00:28 -0500 (0:00:01.205) 0:15:49.353 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 02:00:28 -0500 (0:00:00.415) 0:15:49.768 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 02:00:29 -0500 (0:00:00.688) 0:15:50.457 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 02:00:29 -0500 (0:00:00.730) 0:15:51.187 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 02:00:32 -0500 (0:00:02.904) 0:15:54.091 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 02:00:33 -0500 (0:00:00.662) 0:15:54.754 ******** ok: 
[managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 02:00:34 -0500 (0:00:00.775) 0:15:55.530 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 02:00:34 -0500 (0:00:00.560) 0:15:56.103 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 02:00:35 -0500 (0:00:00.204) 0:15:56.307 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 02:00:35 -0500 (0:00:00.562) 0:15:56.869 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 02:00:35 -0500 (0:00:00.240) 0:15:57.110 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 02:00:36 -0500 (0:00:00.416) 0:15:57.526 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:108607): WARNING **: 02:00:37.236: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 02:00:37 -0500 (0:00:01.142) 0:15:58.669 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 02:00:37 -0500 (0:00:00.278) 0:15:58.947 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 02:00:38 -0500 (0:00:00.566) 0:15:59.514 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 02:00:38 -0500 (0:00:00.221) 0:15:59.735 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 02:00:38 -0500 (0:00:00.222) 0:15:59.958 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 02:00:38 -0500 (0:00:00.236) 0:16:00.194 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 02:00:39 -0500 (0:00:00.178) 0:16:00.372 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37
Sunday 18 January 2026 02:00:39 -0500 (0:00:00.207) 0:16:00.580 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46
Sunday 18 January 2026 02:00:39 -0500 (0:00:00.199) 0:16:00.779 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55
Sunday 18 January 2026 02:00:39 -0500 (0:00:00.200) 0:16:00.980 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64
Sunday 18 January 2026 02:00:39 -0500 (0:00:00.176) 0:16:01.157 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74
Sunday 18 January 2026 02:00:40 -0500 (0:00:00.186) 0:16:01.343 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Reset variables used by tests] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83
Sunday 18 January 2026 02:00:40 -0500 (0:00:00.232) 0:16:01.576 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }

TASK [Check LVM RAID] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91
Sunday 18 January 2026 02:00:40 -0500 (0:00:00.276) 0:16:01.852 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9

TASK [Validate pool member LVM RAID settings] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2
Sunday 18 January 2026 02:00:41 -0500 (0:00:00.463) 0:16:02.315 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8
Sunday 18 January 2026 02:00:41 -0500 (0:00:00.515) 0:16:02.831 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16
Sunday 18 January 2026 02:00:41 -0500 (0:00:00.300) 0:16:03.132 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20
Sunday 18 January 2026 02:00:42 -0500 (0:00:00.323) 0:16:03.456 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Set LV stripe size] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27
Sunday 18 January 2026 02:00:42 -0500 (0:00:00.235) 0:16:03.691 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Parse the requested stripe size] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31
Sunday 18 January 2026 02:00:42 -0500 (0:00:00.278) 0:16:03.970 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }
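
The looped item spells out the device stack under test: the LVM volume /dev/mapper/foo-test1 (kernel device dm-0) is the raw device, and the LUKS mapping /dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664 (dm-1) opened on top of it is what actually gets mounted. A storage_pools input consistent with that item, reconstructed from the logged fields rather than copied from the test source (the password variable is a placeholder; the real value is logged only as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER):

    storage_pools:
      - name: foo
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            fs_type: xfs
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks1
            encryption_cipher: aes-xts-plain64
            encryption_key_size: 512
            encryption_password: "{{ mock_luks_password }}"  # placeholder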
TASK [Set expected stripe size] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37
Sunday 18 January 2026 02:00:43 -0500 (0:00:00.321) 0:16:04.291 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check stripe size] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42
Sunday 18 January 2026 02:00:43 -0500 (0:00:00.338) 0:16:04.630 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" }

TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Sunday 18 January 2026 02:00:43 -0500 (0:00:00.276) 0:16:04.906 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Sunday 18 January 2026 02:00:44 -0500 (0:00:00.552) 0:16:05.458 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about thinpool] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8
Sunday 18 January 2026 02:00:44 -0500 (0:00:00.402) 0:16:05.860 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Check that volume is in correct thinpool (when thinp name is provided)] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16
Sunday 18 January 2026 02:00:44 -0500 (0:00:00.229) 0:16:06.090 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Check that volume is in thinpool (when thinp name is not provided)] ******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Sunday 18 January 2026 02:00:45 -0500 (0:00:00.264) 0:16:06.355 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Sunday 18 January 2026 02:00:45 -0500 (0:00:00.204) 0:16:06.559 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Sunday 18 January 2026 02:00:45 -0500 (0:00:00.246) 0:16:06.805 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9

TASK [Set test variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Sunday 18 January 2026 02:00:46 -0500 (0:00:00.651) 0:16:07.457 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Sunday 18 January 2026 02:00:46 -0500 (0:00:00.747) 0:16:08.205 ********
skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => { "changed": false }
MSG: All items skipped

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Sunday 18 January 2026 02:00:47 -0500 (0:00:00.324) 0:16:08.529 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 => (item=/dev/sda)

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Sunday 18 January 2026 02:00:47 -0500 (0:00:00.414) 0:16:08.943 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Sunday 18 January 2026 02:00:48 -0500 (0:00:00.707) 0:16:09.651 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed
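
For member-level encryption the expected crypttab count is "0": encryption in this test sits on the LV, not on the PV /dev/sda, so the member check passes exactly when no crypttab entry references the member. The shape of that counting assertion, inferred from the logged variables rather than quoted from the test source:

    - name: Check for /etc/crypttab entry
      ansible.builtin.assert:
        that: _storage_test_crypttab_entries | length ==
              _storage_test_expected_crypttab_entries | int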
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Sunday 18 January 2026 02:00:49 -0500 (0:00:00.689) 0:16:10.340 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Sunday 18 January 2026 02:00:49 -0500 (0:00:00.603) 0:16:10.943 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Sunday 18 January 2026 02:00:50 -0500 (0:00:00.594) 0:16:11.538 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Sunday 18 January 2026 02:00:50 -0500 (0:00:00.595) 0:16:12.133 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Sunday 18 January 2026 02:00:51 -0500 (0:00:00.247) 0:16:12.381 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Sunday 18 January 2026 02:00:51 -0500 (0:00:00.274) 0:16:12.656 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Sunday 18 January 2026 02:00:52 -0500 (0:00:00.730) 0:16:13.404 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Sunday 18 January 2026 02:00:52 -0500 (0:00:00.616) 0:16:14.020 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.190) 0:16:14.210 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.168) 0:16:14.378 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.143) 0:16:14.521 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.220) 0:16:14.741 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.232) 0:16:14.974 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }
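
All six VDO tasks share one guard, so a plain LVM volume with neither deduplication nor compression requested skips the whole block in a single pass. A sketch of the guard as logged (the command shown is illustrative, not the test's actual task body):

    - name: Get information about VDO deduplication
      ansible.builtin.command: vdostats --verbose  # illustrative command
      register: storage_test_vdo_status
      changed_when: false
      when: storage_test_vdo_volume.deduplication != none or
            storage_test_vdo_volume.compression != none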
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Sunday 18 January 2026 02:00:53 -0500 (0:00:00.200) 0:16:15.175 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Sunday 18 January 2026 02:00:54 -0500 (0:00:00.248) 0:16:15.423 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Sunday 18 January 2026 02:00:54 -0500 (0:00:00.695) 0:16:16.119 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Sunday 18 January 2026 02:00:55 -0500 (0:00:00.188) 0:16:16.307 ********
skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Sunday 18 January 2026 02:00:55 -0500 (0:00:00.220) 0:16:16.527 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Sunday 18 January 2026 02:00:55 -0500 (0:00:00.135) 0:16:16.663 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Sunday 18 January 2026 02:00:55 -0500 (0:00:00.178) 0:16:16.842 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Sunday 18 January 2026 02:00:55 -0500 (0:00:00.282) 0:16:17.124 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Sunday 18 January 2026 02:00:56 -0500 (0:00:00.242) 0:16:17.367 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Sunday 18 January 2026 02:00:56 -0500 (0:00:00.243) 0:16:17.610 ********
ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Sunday 18 January 2026 02:00:56 -0500 (0:00:00.228) 0:16:17.839 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 18 January 2026 02:00:57 -0500 (0:00:00.372) 0:16:18.211 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 18 January 2026 02:00:57 -0500 (0:00:00.615) 0:16:18.826 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Sunday 18 January 2026 02:00:59 -0500 (0:00:02.109) 0:16:20.935 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Sunday 18 January 2026 02:01:00 -0500 (0:00:00.387) 0:16:21.322 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Sunday 18 January 2026 02:01:00 -0500 (0:00:00.754) 0:16:22.077 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Sunday 18 January 2026 02:01:01 -0500 (0:00:00.704) 0:16:22.782 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Sunday 18 January 2026 02:01:01 -0500 (0:00:00.420) 0:16:23.203 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Sunday 18 January 2026 02:01:02 -0500 (0:00:00.705) 0:16:23.908 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }
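
The mount checks key off storage_test_device_path, the LUKS mapper path set above, rather than the raw LV, since that is the device that actually appears in the mount table. A check of that shape can be written against ansible_facts.mounts (a sketch under that assumption, not the test's verbatim task):

    - name: Verify the current mount state by device
      ansible.builtin.assert:
        that:
          - ansible_facts.mounts
            | selectattr('device', 'equalto', storage_test_device_path)
            | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
            | list | length == 1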
TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Sunday 18 January 2026 02:01:03 -0500 (0:00:00.781) 0:16:24.690 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Sunday 18 January 2026 02:01:04 -0500 (0:00:00.730) 0:16:25.420 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Sunday 18 January 2026 02:01:04 -0500 (0:00:00.209) 0:16:25.629 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Sunday 18 January 2026 02:01:04 -0500 (0:00:00.194) 0:16:25.867 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Sunday 18 January 2026 02:01:04 -0500 (0:00:00.224) 0:16:26.092 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Sunday 18 January 2026 02:01:05 -0500 (0:00:00.222) 0:16:26.315 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Sunday 18 January 2026 02:01:06 -0500 (0:00:00.906) 0:16:27.221 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed
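
The fstab facts above are match lists: the device identifier, the mount point, and the options field are each extracted from the fetched /etc/fstab and then compared against an expected count of 1. Assuming storage_test_fstab holds the fetched file content (the later "Clean up variable namespace" task suggests it does), the id match list could be produced along these lines:

    - name: Set some variables for fstab checking (sketch)
      ansible.builtin.set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines
          | select('search', storage_test_volume._mount_id) | list }}"

The trailing space in the logged match ("/dev/mapper/luks-... ") hints that the real extraction keeps the field delimiter, but a count-based assertion works either way.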
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Sunday 18 January 2026 02:01:06 -0500 (0:00:00.586) 0:16:27.808 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Sunday 18 January 2026 02:01:07 -0500 (0:00:00.665) 0:16:28.473 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Sunday 18 January 2026 02:01:07 -0500 (0:00:00.531) 0:16:29.004 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Sunday 18 January 2026 02:01:08 -0500 (0:00:00.524) 0:16:29.529 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Sunday 18 January 2026 02:01:08 -0500 (0:00:00.252) 0:16:29.781 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Sunday 18 January 2026 02:01:09 -0500 (0:00:00.608) 0:16:30.390 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Sunday 18 January 2026 02:01:09 -0500 (0:00:00.521) 0:16:30.912 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719600.4861777, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719600.4861777, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1928, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719600.4861777, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Sunday 18 January 2026 02:01:10 -0500 (0:00:01.238) 0:16:32.150 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Sunday 18 January 2026 02:01:11 -0500 (0:00:00.289) 0:16:32.440 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Sunday 18 January 2026 02:01:11 -0500 (0:00:00.243) 0:16:32.683 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Sunday 18 January 2026 02:01:11 -0500 (0:00:00.404) 0:16:33.088 ********
ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Sunday 18 January 2026 02:01:12 -0500 (0:00:00.338) 0:16:33.427 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Sunday 18 January 2026 02:01:12 -0500 (0:00:00.257) 0:16:33.684 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Sunday 18 January 2026 02:01:12 -0500 (0:00:00.319) 0:16:34.003 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719600.95618, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719600.95618, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1976, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719600.95618, "nlink": 1, "path": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Sunday 18 January 2026 02:01:14 -0500 (0:00:01.297) 0:16:35.301 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Sunday 18 January 2026 02:01:16 -0500 (0:00:02.149) 0:16:37.450 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006650", "end": "2026-01-18 02:01:17.441160", "rc": 0, "start": "2026-01-18 02:01:17.434510" }
STDOUT:
LUKS header information for /dev/mapper/foo-test1
Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 16384
MK bits:        512
MK digest:      0d 79 05 81 fd 12 aa b4 ac 3b 42 74 05 f7 7e 33 3b 56 cb 7b
MK salt:        15 75 e6 76 8a ce 29 95 4c 21 c5 20 6b 45 4e 89 43 d5 82 80 0e b9 1f d9 e0 df 97 16 46 10 55 0e
MK iterations:  133474
UUID:           624bac9e-643d-439d-b06b-2978f908f664
Key Slot 0: ENABLED
        Iterations:          2135592
        Salt:                d6 58 a8 f2 f2 67 00 cc 38 8b 09 13 a7 3c e5 7d b4 31 a3 6d b6 9c 0a 7b d8 1f 0c f1 9b 7b 48 67
        Key material offset: 8
        AF stripes:          4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Sunday 18 January 2026 02:01:17 -0500 (0:00:01.392) 0:16:38.843 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Sunday 18 January 2026 02:01:18 -0500 (0:00:00.663) 0:16:39.507 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Sunday 18 January 2026 02:01:18 -0500 (0:00:00.640) 0:16:40.147 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Sunday 18 January 2026 02:01:19 -0500 (0:00:00.284) 0:16:40.477 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Sunday 18 January 2026 02:01:19 -0500 (0:00:00.272) 0:16:40.750 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Sunday 18 January 2026 02:01:21 -0500 (0:00:02.066) 0:16:42.816 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed
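
The luksDump output lines up with every requested setting: "Version: 1" matches encryption_luks_version: luks1, "Cipher name: aes" plus "Cipher mode: xts-plain64" match encryption_cipher: aes-xts-plain64, and "MK bits: 512" matches encryption_key_size: 512. The version and key-size assertions that pass right after can be sketched as pattern checks on the registered dump (the register name here is illustrative):

    - name: Check LUKS version
      ansible.builtin.assert:
        that: luks_dump.stdout is search('Version:\s+1')

    - name: Check LUKS key size
      ansible.builtin.assert:
        that: luks_dump.stdout is search('MK bits:\s+' ~ storage_test_volume.encryption_key_size)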
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Sunday 18 January 2026 02:01:23 -0500 (0:00:01.753) 0:16:44.570 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Sunday 18 January 2026 02:01:25 -0500 (0:00:01.769) 0:16:46.339 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Sunday 18 January 2026 02:01:27 -0500 (0:00:02.420) 0:16:48.760 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Sunday 18 January 2026 02:01:28 -0500 (0:00:00.672) 0:16:49.433 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Sunday 18 January 2026 02:01:28 -0500 (0:00:00.744) 0:16:50.177 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Sunday 18 January 2026 02:01:29 -0500 (0:00:00.746) 0:16:50.923 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Sunday 18 January 2026 02:01:30 -0500 (0:00:00.699) 0:16:51.622 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Sunday 18 January 2026 02:01:30 -0500 (0:00:00.280) 0:16:51.903 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Sunday 18 January 2026 02:01:30 -0500 (0:00:00.235) 0:16:52.138 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Sunday 18 January 2026 02:01:31 -0500 (0:00:00.221) 0:16:52.359 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Sunday 18 January 2026 02:01:31 -0500 (0:00:00.172) 0:16:52.532 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Sunday 18 January 2026 02:01:31 -0500 (0:00:00.202) 0:16:52.734 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Sunday 18 January 2026 02:01:31 -0500 (0:00:00.186) 0:16:52.921 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Sunday 18 January 2026 02:01:31 -0500 (0:00:00.218) 0:16:53.140 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Sunday 18 January 2026 02:01:32 -0500 (0:00:00.298) 0:16:53.438 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Sunday 18 January 2026 02:01:32 -0500 (0:00:00.211) 0:16:53.649 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Sunday 18 January 2026 02:01:32 -0500 (0:00:00.214) 0:16:53.864 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Sunday 18 January 2026 02:01:32 -0500 (0:00:00.194) 0:16:54.058 ********
ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Sunday 18 January 2026 02:01:36 -0500 (0:00:03.289) 0:16:57.348 ********
ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Sunday 18 January 2026 02:01:37 -0500 (0:00:01.649) 0:16:58.997 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Sunday 18 January 2026 02:01:38 -0500 (0:00:00.799) 0:16:59.796 ********
ok: [managed-node9] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Sunday 18 January 2026 02:01:38 -0500 (0:00:00.340) 0:17:00.136 ********
ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Sunday 18 January 2026 02:01:40 -0500 (0:00:01.608) 0:17:01.745 ********
skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Sunday 18 January 2026 02:01:41 -0500 (0:00:00.729) 0:17:02.474 ********
skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Sunday 18 January 2026 02:01:42 -0500 (0:00:00.746) 0:17:03.221 ********
skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Sunday 18 January 2026 02:01:42 -0500 (0:00:00.656) 0:17:03.878 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" }
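
Both "parse" steps normalize sizes to bytes, which is why expected and actual compare equal: 4g is 4 * 1024^3 = 4294967296 bytes. The parent device reports 10726680821 bytes, which the tooling presents as 9 GiB after rounding down to whole units. In a playbook the same normalization is available through the human_to_bytes filter:

    - name: Establish base value for expected size (sketch)
      ansible.builtin.set_fact:
        storage_test_expected_size: "{{ '4G' | human_to_bytes }}"  # -> 4294967296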
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 02:01:43 -0500 (0:00:00.763) 0:17:04.641 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 02:01:44 -0500 (0:00:00.622) 0:17:05.264 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 02:01:44 -0500 (0:00:00.601) 0:17:05.866 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 02:01:45 -0500 (0:00:00.528) 0:17:06.395 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 02:01:45 -0500 (0:00:00.704) 0:17:07.100 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 02:01:46 -0500 (0:00:00.738) 0:17:07.838 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 02:01:47 -0500 (0:00:00.637) 0:17:08.476 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 02:01:47 -0500 (0:00:00.728) 0:17:09.205 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 02:01:48 -0500 (0:00:00.647) 0:17:09.852 ******** 
skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 02:01:49 -0500 (0:00:00.728) 0:17:10.581 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 02:01:49 -0500 (0:00:00.593) 0:17:11.175 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 02:01:50 -0500 (0:00:00.828) 0:17:12.004 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 02:01:51 -0500 (0:00:00.783) 0:17:12.787 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 02:01:52 -0500 (0:00:00.602) 0:17:13.389 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 02:01:52 -0500 (0:00:00.667) 0:17:14.057 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 02:01:53 -0500 (0:00:00.528) 0:17:14.585 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 02:01:53 -0500 (0:00:00.341) 0:17:14.927 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: 
TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Sunday 18 January 2026 02:01:54 -0500 (0:00:00.337) 0:17:15.264 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Sunday 18 January 2026 02:01:54 -0500 (0:00:00.600) 0:17:15.865 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030702", "end": "2026-01-18 02:01:55.724626", "rc": 0, "start": "2026-01-18 02:01:55.693924" }
STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Sunday 18 January 2026 02:01:55 -0500 (0:00:01.244) 0:17:17.109 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Sunday 18 January 2026 02:01:56 -0500 (0:00:00.544) 0:17:17.670 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Sunday 18 January 2026 02:01:57 -0500 (0:00:00.657) 0:17:18.328 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Sunday 18 January 2026 02:01:57 -0500 (0:00:00.640) 0:17:18.969 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Sunday 18 January 2026 02:01:58 -0500 (0:00:00.580) 0:17:19.549 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Sunday 18 January 2026 02:01:59 -0500 (0:00:00.662) 0:17:20.212 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }
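
The whole cache-verification block above hangs off one lvs invocation: the --nameprefixes/--noheadings/--unquoted flags make lvs emit shell-style LVM2_NAME=value pairs (LVM2_SEGTYPE=linear here), which the test then picks apart. A sketch of that query-and-parse pattern against the same foo/test1 volume; the regex extraction is illustrative rather than the test's own code:

- name: Get information about the LV (sketch)
  ansible.builtin.command:
    argv: [lvs, --noheadings, --nameprefixes, --units=b, --nosuffix, --unquoted, -o, "name,attr,cache_total_blocks,chunk_size,segtype", foo/test1]
  register: _lvs_info
  changed_when: false

- name: Set LV segment type (sketch)
  ansible.builtin.set_fact:
    storage_test_lv_segtype: "{{ _lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"
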
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Sunday 18 January 2026 02:01:59 -0500 (0:00:00.567) 0:17:20.780 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Sunday 18 January 2026 02:01:59 -0500 (0:00:00.255) 0:17:21.035 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Sunday 18 January 2026 02:02:00 -0500 (0:00:00.569) 0:17:21.605 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:399
Sunday 18 January 2026 02:02:00 -0500 (0:00:00.302) 0:17:21.907 ********
included: fedora.linux_system_roles.storage for managed-node9

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Sunday 18 January 2026 02:02:01 -0500 (0:00:00.708) 0:17:22.616 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Sunday 18 January 2026 02:02:01 -0500 (0:00:00.329) 0:17:22.945 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Sunday 18 January 2026 02:02:02 -0500 (0:00:00.693) 0:17:23.639 ********
skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node9] =>
(item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Sunday 18 January 2026 02:02:03 -0500 (0:00:00.826) 0:17:24.466 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Sunday 18 January 2026 02:02:03 -0500 (0:00:00.402) 0:17:24.868 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Sunday 18 January 2026 02:02:03 -0500 (0:00:00.243) 0:17:25.111 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Sunday 18 January 2026 02:02:04 -0500 (0:00:00.344) 0:17:25.456 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Sunday 18 January 2026 02:02:04 -0500 (0:00:00.255) 0:17:25.712 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Sunday 18 January 2026 02:02:05 -0500 (0:00:00.788) 0:17:26.500 ********
ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs
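
The next two tasks only echo the role's inputs. For orientation, a play that hands the role the pool printed below would look roughly like this; the shape follows the role's documented storage_pools variable, not the literal text of tests_luks.yml:

- hosts: all
  vars:
    storage_pools:
      - name: foo
        type: lvm
        disks:
          - sda
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1
  roles:
    - fedora.linux_system_roles.storage
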
"sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:02:08 -0500 (0:00:00.825) 0:17:29.794 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:02:09 -0500 (0:00:00.523) 0:17:30.318 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:02:12 -0500 (0:00:02.955) 0:17:33.273 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:02:12 -0500 (0:00:00.395) 0:17:33.669 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:02:12 -0500 (0:00:00.493) 0:17:34.163 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:02:13 -0500 (0:00:00.653) 0:17:34.816 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:02:14 -0500 (0:00:00.639) 0:17:35.455 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 02:02:16 -0500 (0:00:02.426) 0:17:37.882 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": 
"systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service": { "name": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", 
"source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:02:19 -0500 (0:00:03.085) 0:17:40.967 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:02:20 -0500 (0:00:00.603) 0:17:41.583 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d77ccb785\x2d87b2\x2d452e\x2db620\x2da3b7571d55bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "name": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket -.mount tmp.mount dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d77ccb785\\\\x2d87b2\\\\x2d452e\\\\x2db620\\\\x2da3b7571d55bc.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-77ccb785-87b2-452e-b620-a3b7571d55bc", "DevicePolicy": "auto", 
"Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-77ccb785-87b2-452e-b620-a3b7571d55bc /dev/sda1 /tmp/storage_test0ncq9mm1lukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-77ccb785-87b2-452e-b620-a3b7571d55bc /dev/sda1 /tmp/storage_test0ncq9mm1lukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-77ccb785-87b2-452e-b620-a3b7571d55bc ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-77ccb785-87b2-452e-b620-a3b7571d55bc ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", 
"ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d77ccb785\\\\x2d87b2\\\\x2d452e\\\\x2db620\\\\x2da3b7571d55bc.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_test0ncq9mm1lukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:00:14 EST", "StateChangeTimestampMonotonic": "11192806828", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d77ccb785\\\\x2d87b2\\\\x2d452e\\\\x2db620\\\\x2da3b7571d55bc.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70
Sunday 18 January 2026 02:02:22 -0500 (0:00:01.742) 0:17:43.325 ********
ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85
Sunday 18 January 2026 02:02:25 -0500 (0:00:02.891) 0:17:46.217 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92
Sunday 18 January 2026 02:02:25 -0500 (0:00:00.504) 0:17:46.722 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719611.9992316, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9f4f6fedb74e59f8caaae3da0a3adca4e6bbf621", "ctime": 1768719611.9952316, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir":
false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719611.9952316, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 02:02:26 -0500 (0:00:01.335) 0:17:48.057 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:02:27 -0500 (0:00:00.336) 0:17:48.394 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d77ccb785\x2d87b2\x2d452e\x2db620\x2da3b7571d55bc.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "name": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": 
"/etc/systemd/system/systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d77ccb785\\x2d87b2\\x2d452e\\x2db620\\x2da3b7571d55bc.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d77ccb785\\\\x2d87b2\\\\x2d452e\\\\x2db620\\\\x2da3b7571d55bc.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", 
"RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 02:02:28 -0500 (0:00:01.682) 0:17:50.077 ******** ok: [managed-node9] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 02:02:29 -0500 (0:00:00.539) 0:17:50.617 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 02:02:29 -0500 (0:00:00.283) 0:17:50.900 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 02:02:29 -0500 (0:00:00.259) 0:17:51.160 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 02:02:30 -0500 (0:00:00.639) 0:17:51.799 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 02:02:32 -0500 (0:00:01.925) 0:17:53.742 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node9] => (item={'src': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 02:02:34 -0500 (0:00:01.905) 0:17:55.648 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 02:02:35 -0500 (0:00:00.721) 0:17:56.369 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 02:02:37 -0500 (0:00:01.992) 0:17:58.362 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719625.6042953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "026fcf0376d68cdd5587a5977573214f09382627", "ctime": 1768719618.1782606, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966665, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false,
"islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768719618.1789842, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2976265988", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 02:02:38 -0500 (0:00:01.296) 0:17:59.659 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 02:02:38 -0500 (0:00:00.236) 0:17:59.895 ******** ok: [managed-node9] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:413 Sunday 18 January 2026 02:02:40 -0500 (0:00:02.238) 0:18:02.134 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:420 Sunday 18 January 2026 02:02:41 -0500 (0:00:00.311) 0:18:02.445 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 02:02:41 -0500 (0:00:00.596) 0:18:03.042 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 02:02:42 -0500 (0:00:00.665) 0:18:03.708 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 02:02:43 -0500 (0:00:00.582) 0:18:04.290 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "624bac9e-643d-439d-b06b-2978f908f664" }, "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "size": "4G", "type": "crypt", "uuid": "24a74452-32f4-49ab-864a-37839714674e" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 02:02:44 -0500 (0:00:01.365) 0:18:05.655 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003120", "end": "2026-01-18 02:02:45.469745", "rc": 0, "start": "2026-01-18 02:02:45.466625" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. 
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 02:02:45 -0500 (0:00:01.256) 0:18:06.911 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003188", "end": "2026-01-18 02:02:46.671622", "failed_when_result": false, "rc": 0, "start": "2026-01-18 02:02:46.668434" } STDOUT: luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 02:02:46 -0500 (0:00:01.118) 0:18:08.030 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 02:02:47 -0500 (0:00:01.050) 0:18:09.080 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 02:02:48 -0500 (0:00:00.321) 0:18:09.402 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029450", "end": "2026-01-18 02:02:49.454503", "rc": 0, "start": "2026-01-18 02:02:49.425053" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 02:02:49 -0500 (0:00:01.406) 0:18:10.809 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 02:02:49 -0500 (0:00:00.352) 0:18:11.161 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 02:02:50 -0500 (0:00:00.513) 0:18:11.675 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 02:02:51 -0500 (0:00:00.890) 0:18:12.566 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 02:02:52 -0500 (0:00:01.279) 0:18:13.845 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }
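The facts set above, __pvs_lvm_len and _storage_test_expected_pv_count, feed the "Verify PV count" assertion below. A sketch of that comparison using only variable names visible in this log (the exact wording in test-verify-pool-members.yml may differ):

    # Hedged sketch of the PV-count assertion
    - name: Verify PV count
      ansible.builtin.assert:
        that:
          - __pvs_lvm_len | int == _storage_test_expected_pv_count | int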
TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 02:02:53 -0500 (0:00:00.765) 0:18:14.611 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 02:02:54 -0500 (0:00:00.643) 0:18:15.254 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 02:02:54 -0500 (0:00:00.725) 0:18:15.979 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 02:02:55 -0500 (0:00:00.343) 0:18:16.322 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 02:02:55 -0500 (0:00:00.785) 0:18:17.108 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 02:02:56 -0500 (0:00:00.395) 0:18:17.503 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 02:02:56 -0500 (0:00:00.360) 0:18:17.864 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:113041): WARNING **: 02:02:57.764: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final
all' host 10.31.45.176 originally 10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 02:02:58 -0500 (0:00:01.348) 0:18:19.213 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 02:02:58 -0500 (0:00:00.592) 0:18:19.806 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 02:02:59 -0500 (0:00:00.577) 0:18:20.383 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 02:02:59 -0500 (0:00:00.214) 0:18:20.597 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 02:02:59 -0500 (0:00:00.159) 0:18:20.757 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 02:02:59 -0500 (0:00:00.272) 0:18:21.030 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 02:03:00 -0500 (0:00:00.201) 0:18:21.232 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was 
False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 02:03:00 -0500 (0:00:00.264) 0:18:21.497 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 02:03:00 -0500 (0:00:00.243) 0:18:21.740 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 02:03:00 -0500 (0:00:00.235) 0:18:21.976 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 02:03:01 -0500 (0:00:00.244) 0:18:22.220 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 02:03:01 -0500 (0:00:00.210) 0:18:22.430 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 02:03:01 -0500 (0:00:00.227) 0:18:22.658 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 02:03:01 -0500 (0:00:00.261) 0:18:22.920 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 02:03:02 -0500 (0:00:00.727) 0:18:23.647 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 
'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 18 January 2026 02:03:02 -0500 (0:00:00.509) 0:18:24.156 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 18 January 2026 02:03:03 -0500 (0:00:00.305) 0:18:24.461 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 18 January 2026 02:03:03 -0500 (0:00:00.334) 0:18:24.796 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 18 January 2026 02:03:03 -0500 (0:00:00.302) 0:18:25.098 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 18 January 2026 02:03:04 -0500 (0:00:00.242) 0:18:25.341 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 
Sunday 18 January 2026 02:03:04 -0500 (0:00:00.267) 0:18:25.608 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 18 January 2026 02:03:04 -0500 (0:00:00.309) 0:18:25.917 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 02:03:05 -0500 (0:00:00.368) 0:18:26.286 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 02:03:05 -0500 (0:00:00.607) 0:18:26.893 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 18 January 2026 02:03:06 -0500 (0:00:00.495) 0:18:27.388 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 18 January 2026 02:03:06 -0500 (0:00:00.259) 0:18:27.648 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } 
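These thin-pool checks are all gated on storage_test_thin_volume.thin, and this run's volume is a plain (non-thin) LV, so they skip. When they do run, thin-pool membership can be read straight from LVM; one plausible way to gather it (lvs and its pool_lv output field are standard LVM, but the command actually used in verify-pool-member-thin.yml is not shown in this log):

    # Hedged sketch: prints the thin pool backing foo/test1 (empty for a plain LV)
    - name: Get the thin pool backing the LV
      ansible.builtin.command: lvs --noheadings -o pool_lv foo/test1
      register: storage_test_thin_status
      changed_when: false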
TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 18 January 2026 02:03:06 -0500 (0:00:00.231) 0:18:27.880 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 18 January 2026 02:03:06 -0500 (0:00:00.231) 0:18:28.112 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 02:03:07 -0500 (0:00:00.190) 0:18:28.302 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 02:03:07 -0500 (0:00:00.578) 0:18:28.881 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 02:03:09 -0500 (0:00:01.928) 0:18:30.809 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 02:03:11 -0500 (0:00:01.477) 0:18:32.287 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 18 January 2026 02:03:11 -0500 (0:00:00.466) 0:18:32.753 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 18 January 2026 02:03:12 -0500 (0:00:00.633) 0:18:33.387 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
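The assertion that just passed compares the crypttab entries found for the pool member /dev/sda against _storage_test_expected_crypttab_entries, which is "0" here: encryption in this test sits on the LV (/dev/mapper/foo-test1), not on the PV, so the member itself must have no /etc/crypttab line. A sketch of that count check built only from the fact names logged above (the real task in verify-pool-member-crypttab.yml may phrase it differently):

    # Hedged sketch of the crypttab entry-count assertion
    - name: Check for /etc/crypttab entry
      ansible.builtin.assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int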
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 18 January 2026 02:03:12 -0500 (0:00:00.620) 0:18:34.008 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 18 January 2026 02:03:13 -0500 (0:00:00.606) 0:18:34.614 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 18 January 2026 02:03:13 -0500 (0:00:00.557) 0:18:35.172 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 18 January 2026 02:03:14 -0500 (0:00:00.742) 0:18:35.915 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 18 January 2026 02:03:14 -0500 (0:00:00.279) 0:18:36.194 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Sunday 18 January 2026 02:03:15 -0500 (0:00:00.185) 0:18:36.380 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 18 January 2026 02:03:15 -0500 (0:00:00.619) 0:18:37.000 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks':
[], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 18 January 2026 02:03:16 -0500 (0:00:00.489) 0:18:37.489 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 18 January 2026 02:03:16 -0500 (0:00:00.275) 0:18:37.765 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 18 January 2026 02:03:16 -0500 (0:00:00.272) 0:18:38.038 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 18 January 2026 02:03:17 -0500 (0:00:00.272) 0:18:38.310 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 18 January 2026 02:03:17 -0500 (0:00:00.275) 0:18:38.586 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 18 January 2026 02:03:17 -0500 (0:00:00.200) 0:18:38.787 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }
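Every VDO check above skips because this volume requests neither deduplication nor compression (both are null in the logged item). For a volume that did use VDO, the role input would populate the same keys that appear in the volume dict, roughly like this (the volume name and both sizes are illustrative, not taken from this run):

    # Hedged sketch of a VDO-backed volume definition for the storage role
    storage_pools:
      - name: foo
        disks: [sda]
        volumes:
          - name: vdo1
            size: 100g            # logical size (illustrative)
            vdo_pool_size: 10g    # physical VDO pool size (illustrative)
            deduplication: true
            compression: true
            fs_type: xfs
            mount_point: /opt/test2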
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 18 January 2026 02:03:17 -0500 (0:00:00.188) 0:18:38.975 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Sunday 18 January 2026 02:03:17 -0500 (0:00:00.163) 0:18:39.139 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 18 January 2026 02:03:18 -0500 (0:00:00.860) 0:18:39.999 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 18 January 2026 02:03:18 -0500 (0:00:00.177) 0:18:40.177 ******** skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 18 January 2026 02:03:19 -0500 (0:00:00.183) 0:18:40.360 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 18 January 2026 02:03:19 -0500 (0:00:00.303) 0:18:40.664 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 18 January 2026 02:03:19 -0500 (0:00:00.193) 0:18:40.857 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 18 January 2026 02:03:19 -0500 (0:00:00.194) 0:18:41.052 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 18 January 2026 02:03:20 -0500 (0:00:00.190) 0:18:41.242 ******** ok: [managed-node9]
=> { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 02:03:20 -0500 (0:00:00.222) 0:18:41.465 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 02:03:20 -0500 (0:00:00.266) 0:18:41.731 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 02:03:21 -0500 (0:00:00.507) 0:18:42.239 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 02:03:21 -0500 (0:00:00.654) 0:18:42.894 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 02:03:23 -0500 (0:00:01.868) 0:18:44.762 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 02:03:24 -0500 (0:00:00.468) 0:18:45.231 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 02:03:24 -0500 (0:00:00.496) 0:18:45.727 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 02:03:25 -0500 (0:00:00.713) 0:18:46.440 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 02:03:25 -0500 (0:00:00.282) 0:18:46.723 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 02:03:25 -0500 (0:00:00.442) 0:18:47.165 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 02:03:26 -0500 (0:00:00.769) 0:18:47.935 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 02:03:27 -0500 (0:00:00.644) 0:18:48.579 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 02:03:27 -0500 (0:00:00.244) 0:18:48.823 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 02:03:27 -0500 (0:00:00.269) 0:18:49.093 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 02:03:28 -0500 (0:00:00.266) 0:18:49.360 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 02:03:28 -0500 (0:00:00.242) 0:18:49.602 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 02:03:29 -0500 (0:00:01.091) 0:18:50.694 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 02:03:30 -0500 (0:00:00.563) 0:18:51.258 
******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 02:03:30 -0500 (0:00:00.591) 0:18:51.849 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 02:03:31 -0500 (0:00:00.576) 0:18:52.426 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 02:03:31 -0500 (0:00:00.548) 0:18:52.975 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 02:03:32 -0500 (0:00:00.297) 0:18:53.272 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 02:03:32 -0500 (0:00:00.523) 0:18:53.796 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 02:03:33 -0500 (0:00:00.569) 0:18:54.366 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719677.4395375, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719600.4861777, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1928, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719600.4861777, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 02:03:34 -0500 (0:00:01.305) 0:18:55.671 
******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 02:03:34 -0500 (0:00:00.331) 0:18:56.003 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 02:03:35 -0500 (0:00:00.279) 0:18:56.282 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 02:03:35 -0500 (0:00:00.321) 0:18:56.604 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 02:03:35 -0500 (0:00:00.337) 0:18:56.942 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 02:03:35 -0500 (0:00:00.251) 0:18:57.193 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 02:03:36 -0500 (0:00:00.340) 0:18:57.533 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719744.7328522, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719600.95618, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1976, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719600.95618, "nlink": 1, "path": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 02:03:37 -0500 (0:00:01.284) 0:18:58.818 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: 
cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 02:03:39 -0500 (0:00:02.056) 0:19:00.888 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006700", "end": "2026-01-18 02:03:40.656068", "rc": 0, "start": "2026-01-18 02:03:40.649368" } STDOUT:

LUKS header information for /dev/mapper/foo-test1

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 16384
MK bits:        512
MK digest:      0d 79 05 81 fd 12 aa b4 ac 3b 42 74 05 f7 7e 33 3b 56 cb 7b
MK salt:        15 75 e6 76 8a ce 29 95 4c 21 c5 20 6b 45 4e 89
                43 d5 82 80 0e b9 1f d9 e0 df 97 16 46 10 55 0e
MK iterations:  133474
UUID:           624bac9e-643d-439d-b06b-2978f908f664

Key Slot 0: ENABLED
        Iterations:             2135592
        Salt:                   d6 58 a8 f2 f2 67 00 cc 38 8b 09 13 a7 3c e5 7d
                                b4 31 a3 6d b6 9c 0a 7b d8 1f 0c f1 9b 7b 48 67
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 02:03:40 -0500 (0:00:01.140) 0:19:02.028 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 02:03:41 -0500 (0:00:00.640) 0:19:02.745 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 02:03:42 -0500 (0:00:00.640) 0:19:03.385 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 02:03:42 -0500 (0:00:00.358) 0:19:03.744 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 02:03:42 -0500 (0:00:00.332) 0:19:04.076 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 02:03:43 -0500 (0:00:00.876) 0:19:04.953 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 02:03:44 -0500 (0:00:00.259) 0:19:05.212 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 02:03:44 -0500 (0:00:00.381) 0:19:05.593 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 02:03:45 -0500 (0:00:00.816) 0:19:06.410 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 02:03:45 -0500 (0:00:00.662) 0:19:07.073 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 02:03:46 -0500 (0:00:00.712) 0:19:07.785 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 02:03:47 -0500 (0:00:00.738) 0:19:08.523 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 02:03:48 -0500 (0:00:00.728) 0:19:09.252 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 02:03:48 -0500 (0:00:00.304) 0:19:09.557 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 02:03:48 -0500 (0:00:00.244) 0:19:09.801 ******** skipping: [managed-node9] => { 
"changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 02:03:48 -0500 (0:00:00.248) 0:19:10.050 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 02:03:49 -0500 (0:00:00.267) 0:19:10.317 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 02:03:49 -0500 (0:00:00.244) 0:19:10.561 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 02:03:49 -0500 (0:00:00.270) 0:19:10.831 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 02:03:49 -0500 (0:00:00.289) 0:19:11.121 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 02:03:50 -0500 (0:00:00.207) 0:19:11.328 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 02:03:50 -0500 (0:00:00.220) 0:19:11.548 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 02:03:50 -0500 (0:00:00.247) 0:19:11.796 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } 
TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 02:03:50 -0500 (0:00:00.183) 0:19:11.980 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 02:03:52 -0500 (0:00:01.552) 0:19:13.533 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 02:03:54 -0500 (0:00:01.753) 0:19:15.286 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 02:03:54 -0500 (0:00:00.658) 0:19:15.944 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 02:03:55 -0500 (0:00:00.303) 0:19:16.248 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 02:03:56 -0500 (0:00:01.693) 0:19:17.942 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 02:03:57 -0500 (0:00:00.720) 0:19:18.662 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 02:03:58 -0500 (0:00:00.743) 0:19:19.406 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 02:03:58 -0500 (0:00:00.701) 0:19:20.107 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 02:03:59 -0500 (0:00:00.614) 0:19:20.722 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 02:04:00 -0500 (0:00:00.655) 0:19:21.377 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 02:04:00 -0500 (0:00:00.630) 0:19:22.008 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 02:04:01 -0500 (0:00:00.679) 0:19:22.687 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 02:04:02 -0500 (0:00:00.746) 0:19:23.433 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 02:04:02 -0500 (0:00:00.664) 0:19:24.097 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 02:04:03 -0500 (0:00:00.692) 0:19:24.790 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 02:04:04 -0500 (0:00:00.647) 0:19:25.438 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 02:04:04 -0500 (0:00:00.641) 0:19:26.079 ******** 
skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 02:04:05 -0500 (0:00:00.701) 0:19:26.781 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 02:04:06 -0500 (0:00:00.867) 0:19:27.648 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 02:04:07 -0500 (0:00:00.812) 0:19:28.461 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 02:04:07 -0500 (0:00:00.739) 0:19:29.200 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 02:04:08 -0500 (0:00:00.682) 0:19:29.883 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 02:04:09 -0500 (0:00:00.633) 0:19:30.516 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 02:04:09 -0500 (0:00:00.643) 0:19:31.160 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 02:04:10 -0500 (0:00:00.320) 0:19:31.480 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 02:04:10 -0500 (0:00:00.255) 0:19:31.735 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 02:04:11 -0500 (0:00:00.703) 0:19:32.439 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.029996", "end": "2026-01-18 02:04:12.403492", "rc": 0, "start": "2026-01-18 02:04:12.373496" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 02:04:12 -0500 (0:00:01.378) 0:19:33.817 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 02:04:13 -0500 (0:00:00.584) 0:19:34.402 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 02:04:13 -0500 (0:00:00.669) 0:19:35.071 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 02:04:14 -0500 (0:00:00.694) 0:19:35.765 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 02:04:15 -0500 (0:00:00.678) 0:19:36.444 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 02:04:15 -0500 (0:00:00.605) 0:19:37.050 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 02:04:16 -0500 (0:00:00.624) 0:19:37.675 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 02:04:16 -0500 (0:00:00.338) 0:19:38.014 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 02:04:17 -0500 (0:00:00.593) 0:19:38.607 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 02:04:17 -0500 (0:00:00.192) 0:19:38.799 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:426 Sunday 18 January 2026 02:04:18 -0500 (0:00:01.204) 0:19:40.004 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 02:04:19 -0500 (0:00:00.576) 0:19:40.580 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 02:04:20 -0500 (0:00:00.720) 0:19:41.300 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 02:04:20 -0500 (0:00:00.552) 0:19:41.853 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 02:04:21 -0500 (0:00:00.398) 0:19:42.252 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", 
"skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 02:04:21 -0500 (0:00:00.608) 0:19:42.861 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 02:04:22 -0500 (0:00:00.684) 0:19:43.545 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 02:04:22 -0500 (0:00:00.321) 0:19:43.867 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 02:04:22 -0500 (0:00:00.261) 0:19:44.128 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 02:04:23 -0500 (0:00:00.251) 0:19:44.380 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate 
provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 02:04:23 -0500 (0:00:00.266) 0:19:44.646 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 02:04:24 -0500 (0:00:00.838) 0:19:45.484 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 02:04:26 -0500 (0:00:02.445) 0:19:47.929 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:04:27 -0500 (0:00:00.734) 0:19:48.664 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:04:28 -0500 (0:00:00.679) 0:19:49.344 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:04:30 -0500 (0:00:02.864) 0:19:52.208 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:04:31 -0500 (0:00:00.647) 0:19:52.856 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:04:32 -0500 (0:00:00.575) 0:19:53.432 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:04:33 -0500 (0:00:00.781) 0:19:54.213 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:04:33 -0500 (0:00:00.786) 0:19:55.000 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 02:04:36 -0500 (0:00:02.217) 0:19:57.230 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { 
"name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": 
"enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": 
"sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service": { "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:04:39 -0500 (0:00:03.184) 0:20:00.415 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:04:40 -0500 (0:00:00.858) 0:20:01.273 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket cryptsetup-pre.target \"dev-mapper-foo\\\\x2dtest1.device\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", 
"CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-624bac9e-643d-439d-b06b-2978f908f664", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-624bac9e-643d-439d-b06b-2978f908f664 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-624bac9e-643d-439d-b06b-2978f908f664 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "yes", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:02:28 EST", "StateChangeTimestampMonotonic": "11326987203", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 02:04:41 -0500 (0:00:01.713) 0:20:02.987 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-624bac9e-643d-439d-b06b-2978f908f664' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 02:04:44 -0500 (0:00:02.953) 0:20:05.941 ******** fatal: [managed-node9]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-624bac9e-643d-439d-b06b-2978f908f664' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:04:45 -0500 (0:00:00.341) 0:20:06.282 ******** changed: [managed-node9] => 
(item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:02:28 EST", "StateChangeTimestampMonotonic": "11326987203", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 18 January 2026 02:04:46 -0500 (0:00:01.659) 0:20:07.941 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 18 January 2026 02:04:47 -0500 (0:00:00.268) 0:20:08.210 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 18 January 2026 02:04:47 -0500 (0:00:00.385) 0:20:08.595 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 18 January 2026 02:04:47 -0500 (0:00:00.214) 0:20:08.810 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719858.6473846, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768719858.6473846, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768719858.6473846, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2560796655", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 18 January 2026 02:04:48 -0500 (0:00:01.308) 0:20:10.118 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449 Sunday 18 January 2026 02:04:49 -0500 (0:00:00.485) 0:20:10.603 ******** included: fedora.linux_system_roles.storage for 
managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 02:04:51 -0500 (0:00:02.440) 0:20:13.043 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 02:04:52 -0500 (0:00:00.450) 0:20:13.493 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 02:04:53 -0500 (0:00:00.734) 0:20:14.228 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 02:04:53 -0500 (0:00:00.815) 0:20:15.043 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 02:04:54 -0500 (0:00:00.333) 0:20:15.377 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was
False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 02:04:54 -0500 (0:00:00.310) 0:20:15.688 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 02:04:54 -0500 (0:00:00.247) 0:20:15.936 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 02:04:54 -0500 (0:00:00.263) 0:20:16.199 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 02:04:55 -0500 (0:00:00.876) 0:20:17.076 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 02:04:58 -0500 (0:00:02.224) 0:20:19.300 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:04:58 -0500 (0:00:00.585) 0:20:19.886 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:04:59 -0500 (0:00:00.698) 0:20:20.585 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:05:02 -0500 (0:00:02.881) 0:20:23.466 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check 
if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:05:02 -0500 (0:00:00.574) 0:20:24.041 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:05:03 -0500 (0:00:00.595) 0:20:24.637 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:05:04 -0500 (0:00:00.687) 0:20:25.324 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:05:04 -0500 (0:00:00.590) 0:20:25.914 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 02:05:07 -0500 (0:00:02.303) 0:20:28.218 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd",
"state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service": { "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:05:10 -0500 (0:00:03.054) 0:20:31.272 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:05:11 -0500 (0:00:00.940) 0:20:32.213 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": 
"systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\" umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-624bac9e-643d-439d-b06b-2978f908f664", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-624bac9e-643d-439d-b06b-2978f908f664 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach 
luks-624bac9e-643d-439d-b06b-2978f908f664 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", 
"RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:02:28 EST", "StateChangeTimestampMonotonic": "11326987203", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 02:05:12 -0500 (0:00:01.640) 0:20:33.854 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", 
"state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 02:05:16 -0500 (0:00:03.961) 0:20:37.815 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 02:05:17 -0500 (0:00:00.646) 0:20:38.462 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719611.9992316, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9f4f6fedb74e59f8caaae3da0a3adca4e6bbf621", "ctime": 1768719611.9952316, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719611.9952316, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97
Sunday 18 January 2026 02:05:18 -0500 (0:00:01.311) 0:20:39.774 ********
ok: [managed-node9] => { "backup": "", "changed": false }

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Sunday 18 January 2026 02:05:20 -0500 (0:00:01.500) 0:20:41.274 ********
changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0",
"InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", 
"StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:02:28 EST", "StateChangeTimestampMonotonic": "11326987203", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 02:05:21 -0500 (0:00:01.632) 0:20:42.906 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": 
"", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 02:05:22 -0500 (0:00:00.374) 0:20:43.281 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 02:05:22 -0500 (0:00:00.313) 0:20:43.595 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 02:05:22 -0500 (0:00:00.274) 0:20:43.870 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664', 'path': 
'/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-624bac9e-643d-439d-b06b-2978f908f664" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Sunday 18 January 2026 02:05:24 -0500 (0:00:01.682) 0:20:45.553 ********
ok: [managed-node9] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Sunday 18 January 2026 02:05:26 -0500 (0:00:01.959) 0:20:47.512 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" }

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Sunday 18 January 2026 02:05:28 -0500 (0:00:01.989) 0:20:49.502 ********
skipping: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" }
skipping: [managed-node9] => { "changed": false }
MSG: All items skipped

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Sunday 18 January 2026 02:05:29 -0500 (0:00:00.803) 0:20:50.305 ********
ok: [managed-node9] => { "changed": false, "name": null, "status": {} }
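The mount changes above go through ansible.posix.mount (note the module redirection lines). A minimal standalone sketch of the same operation, with module parameters taken from the logged item and only the task name invented:

# Sketch only: mirrors the logged "Set up new/current mounts" item using the
# ansible.posix.mount module that ansible.builtin.mount redirects to.
- name: Mount the new volume and persist it in /etc/fstab
  ansible.posix.mount:
    src: /dev/mapper/foo-test1
    path: /opt/test1
    fstype: xfs
    opts: defaults
    state: mounted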
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Sunday 18 January 2026 02:05:31 -0500 (0:00:02.356) 0:20:52.661 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719625.6042953, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "026fcf0376d68cdd5587a5977573214f09382627", "ctime": 1768719618.1782606, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966665, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768719618.1789842, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "2976265988", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202
Sunday 18 January 2026 02:05:32 -0500 (0:00:01.268) 0:20:53.930 ********
changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-624bac9e-643d-439d-b06b-2978f908f664', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-624bac9e-643d-439d-b06b-2978f908f664", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224
Sunday 18 January 2026 02:05:34 -0500 (0:00:01.878) 0:20:55.809 ********
ok: [managed-node9]

TASK [Verify role results - 9] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:465
Sunday 18 January 2026 02:05:36 -0500 (0:00:02.069) 0:20:57.879 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9
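
The crypttab record above removed the one remaining line for the now-unencrypted volume; per crypttab(5) that line had the shape "luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 -" (mapping name, backing device, key field). The role manages /etc/crypttab through its own looped task; as an assumption for illustration only, a roughly equivalent standalone task against the community.general.crypttab module might look like:

    - name: Drop the stale LUKS mapping from /etc/crypttab (illustrative equivalent, not the role's task)
      community.general.crypttab:
        name: luks-624bac9e-643d-439d-b06b-2978f908f664
        state: absent        # the role's entry also tracked backing_device and password '-'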
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 02:05:38 -0500 (0:00:00.865) 0:20:59.440 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 02:05:38 -0500 (0:00:00.721) 0:21:00.162 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "b6bb4240-07ec-4883-9b5c-8d3638d52e9a" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 02:05:40 -0500 (0:00:01.271) 0:21:01.434 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003445", "end": "2026-01-18 02:05:41.236234", "rc": 0, "start": "2026-01-18 02:05:41.232789" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on 
Mon Jan 5 14:46:37 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 02:05:41 -0500 (0:00:01.256) 0:21:02.691 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003081", "end": "2026-01-18 02:05:42.492810", "failed_when_result": false, "rc": 0, "start": "2026-01-18 02:05:42.489729" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 02:05:42 -0500 (0:00:01.179) 0:21:03.870 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 
'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 02:05:43 -0500 (0:00:00.877) 0:21:04.748 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 02:05:43 -0500 (0:00:00.240) 0:21:04.989 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029675", "end": "2026-01-18 02:05:44.972501", "rc": 0, "start": "2026-01-18 02:05:44.942826" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 02:05:45 -0500 (0:00:01.382) 0:21:06.371 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 02:05:45 -0500 (0:00:00.385) 0:21:06.757 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 02:05:46 -0500 (0:00:00.654) 0:21:07.411 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 02:05:46 -0500 (0:00:00.781) 0:21:08.193 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 02:05:48 -0500 (0:00:01.190) 0:21:09.384 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: 
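
The shared-flag check above is a plain command-plus-assert pattern: with --binary, vgs prints the shared attribute as 1/0, and "0" is what a non-shared VG (shared: false in the pool spec) should produce. A minimal sketch of the same verification, with task names assumed for illustration and the command taken verbatim from the record above:

    - name: Get VG shared value status (sketch)
      ansible.builtin.command:
        cmd: vgs --noheadings --binary -o shared foo
      register: vgs_shared
      changed_when: false          # read-only query

    - name: Verify that VG shared value checks out (sketch)
      ansible.builtin.assert:
        that:
          - vgs_shared.stdout | trim == "0"   # '--binary' renders the flag as 1/0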
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Sunday 18 January 2026 02:05:45 -0500 (0:00:00.385) 0:21:06.757 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes)

TASK [Set test variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Sunday 18 January 2026 02:05:46 -0500 (0:00:00.654) 0:21:07.411 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }

TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Sunday 18 January 2026 02:05:46 -0500 (0:00:00.781) 0:21:08.193 ********
ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }

TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Sunday 18 January 2026 02:05:48 -0500 (0:00:01.190) 0:21:09.384 ********
ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }

TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Sunday 18 January 2026 02:05:48 -0500 (0:00:00.625) 0:21:10.009 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Sunday 18 January 2026 02:05:49 -0500 (0:00:00.648) 0:21:10.658 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Sunday 18 January 2026 02:05:50 -0500 (0:00:00.567) 0:21:11.225 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Sunday 18 January 2026 02:05:50 -0500 (0:00:00.274) 0:21:11.500 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Sunday 18 January 2026 02:05:50 -0500 (0:00:00.694) 0:21:12.194 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Sunday 18 January 2026 02:05:51 -0500 (0:00:00.289) 0:21:12.484 ********
ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" }
MSG: All assertions passed

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Sunday 18 January 2026 02:05:51 -0500 (0:00:00.330) 0:21:12.814 ********
ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 }
STDOUT:
** (process:119560): WARNING **: 02:05:52.614: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory
STDERR:
OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.45.176 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/bb59fe80f9'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.45.176 closed.

TASK [Verify that PVs fill the whole devices when they should] *****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78
Sunday 18 January 2026 02:05:52 -0500 (0:00:01.234) 0:21:14.048 ********
skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
skipping: [managed-node9] => { "changed": false }
MSG: All items skipped

TASK [Check MD RAID] ***********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88
Sunday 18 January 2026 02:05:53 -0500 (0:00:00.697) 0:21:14.746 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8
Sunday 18 January 2026 02:05:54 -0500 (0:00:00.597) 0:21:15.343 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14
Sunday 18 January 2026 02:05:54 -0500 (0:00:00.192) 0:21:15.535 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19
Sunday 18 January 2026 02:05:54 -0500 (0:00:00.240) 0:21:15.776 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24
Sunday 18 January 2026 02:05:54 -0500 (0:00:00.189) 0:21:15.966 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }

TASK [Set md chunk size regex] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29
Sunday 18 January 2026 02:05:54 -0500 (0:00:00.218) 0:21:16.184 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" }
False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 02:05:55 -0500 (0:00:00.217) 0:21:16.402 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 02:05:55 -0500 (0:00:00.240) 0:21:16.642 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 02:05:55 -0500 (0:00:00.190) 0:21:16.844 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 02:05:55 -0500 (0:00:00.188) 0:21:17.033 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 02:05:56 -0500 (0:00:00.308) 0:21:17.342 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 02:05:56 -0500 (0:00:00.222) 0:21:17.564 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 02:05:56 -0500 (0:00:00.276) 0:21:17.841 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 02:05:57 -0500 (0:00:00.507) 0:21:18.348 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 
'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 18 January 2026 02:05:57 -0500 (0:00:00.513) 0:21:18.862 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 18 January 2026 02:05:57 -0500 (0:00:00.312) 0:21:19.174 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 18 January 2026 02:05:58 -0500 (0:00:00.349) 0:21:19.523 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 18 January 2026 02:05:58 -0500 (0:00:00.244) 0:21:19.792 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 18 January 2026 02:05:58 -0500 (0:00:00.383) 0:21:20.176 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 18 January 2026 02:05:59 
-0500 (0:00:00.269) 0:21:20.445 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 18 January 2026 02:05:59 -0500 (0:00:00.267) 0:21:20.727 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 02:05:59 -0500 (0:00:00.215) 0:21:20.943 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 02:06:00 -0500 (0:00:00.582) 0:21:21.526 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 18 January 2026 02:06:00 -0500 (0:00:00.416) 0:21:21.942 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 18 January 2026 02:06:00 -0500 (0:00:00.211) 0:21:22.154 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not 
provided)] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 18 January 2026 02:06:01 -0500 (0:00:00.231) 0:21:22.385 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 18 January 2026 02:06:01 -0500 (0:00:00.257) 0:21:22.642 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 02:06:01 -0500 (0:00:00.246) 0:21:22.889 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 02:06:02 -0500 (0:00:00.716) 0:21:23.605 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 02:06:02 -0500 (0:00:00.586) 0:21:24.192 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 02:06:03 -0500 (0:00:00.268) 0:21:24.461 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 18 January 2026 02:06:03 -0500 (0:00:00.480) 0:21:24.942 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 18 January 2026 02:06:04 -0500 (0:00:00.592) 0:21:25.534 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
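
The crypttab member check above boils down to counting matching /etc/crypttab lines: the pool itself is unencrypted, so the facts set just before it pair an empty match list with an expected count of "0". A minimal sketch of that assertion, reusing the fact names from the records above (the task name is assumed for illustration):

    - name: Check for /etc/crypttab entry (sketch of the assertion)
      ansible.builtin.assert:
        that:
          # an empty match list against an expected count of "0"
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int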
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Sunday 18 January 2026 02:06:05 -0500 (0:00:00.775) 0:21:26.309 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Sunday 18 January 2026 02:06:05 -0500 (0:00:00.449) 0:21:26.783 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Sunday 18 January 2026 02:06:05 -0500 (0:00:00.423) 0:21:27.207 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Sunday 18 January 2026 02:06:06 -0500 (0:00:00.587) 0:21:27.794 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Sunday 18 January 2026 02:06:06 -0500 (0:00:00.270) 0:21:28.065 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Sunday 18 January 2026 02:06:07 -0500 (0:00:00.274) 0:21:28.339 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Sunday 18 January 2026 02:06:07 -0500 (0:00:00.665) 0:21:29.005 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Sunday 18 January 2026 02:06:08 -0500 (0:00:00.512) 0:21:29.518 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Sunday 18 January 2026 02:06:08 -0500 (0:00:00.211) 0:21:29.730 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Sunday 18 January 2026 02:06:08 -0500 (0:00:00.216) 0:21:29.946 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Sunday 18 January 2026 02:06:08 -0500 (0:00:00.202) 0:21:30.149 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Sunday 18 January 2026 02:06:09 -0500 (0:00:00.218) 0:21:30.367 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Sunday 18 January 2026 02:06:09 -0500 (0:00:00.267) 0:21:30.634 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Sunday 18 January 2026 02:06:09 -0500 (0:00:00.223) 0:21:30.858 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Sunday 18 January 2026 02:06:09 -0500 (0:00:00.216) 0:21:31.075 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Sunday 18 January 2026 02:06:10 -0500 (0:00:00.767) 0:21:31.842 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Sunday 18 January 2026 02:06:10 -0500 (0:00:00.230) 0:21:32.073 ********
skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Sunday 18 January 2026 02:06:11 -0500 (0:00:00.258) 0:21:32.332 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pools was created] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Sunday 18 January 2026 02:06:11 -0500 (0:00:00.258) 0:21:32.590 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Sunday 18 January 2026 02:06:11 -0500 (0:00:00.249) 0:21:32.840 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Sunday 18 January 2026 02:06:11 -0500 (0:00:00.155) 0:21:32.995 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Sunday 18 January 2026 02:06:12 -0500 (0:00:00.276) 0:21:33.272 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }
"changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 02:06:12 -0500 (0:00:00.262) 0:21:33.534 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 02:06:12 -0500 (0:00:00.223) 0:21:33.757 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 02:06:13 -0500 (0:00:00.483) 0:21:34.241 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 02:06:13 -0500 (0:00:00.650) 0:21:34.891 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 02:06:15 -0500 (0:00:01.957) 0:21:36.849 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 02:06:15 -0500 (0:00:00.329) 0:21:37.179 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 02:06:16 -0500 (0:00:00.690) 0:21:37.869 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 02:06:17 -0500 (0:00:00.829) 0:21:38.698 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 02:06:17 -0500 (0:00:00.336) 0:21:39.035 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 02:06:18 -0500 (0:00:00.531) 0:21:39.567 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 02:06:18 -0500 (0:00:00.564) 0:21:40.131 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 02:06:21 -0500 (0:00:02.420) 0:21:42.552 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 02:06:21 -0500 (0:00:00.259) 0:21:42.844 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 02:06:23 -0500 (0:00:01.405) 0:21:44.249 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 02:06:23 -0500 (0:00:00.199) 0:21:44.448 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 02:06:23 -0500 (0:00:00.244) 0:21:44.692 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 02:06:24 -0500 (0:00:00.982) 0:21:45.675 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 02:06:25 -0500 (0:00:00.660) 0:21:46.335 ******** ok: [managed-node9] => { 
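
The fstab checks above follow the same precompute-then-assert pattern: the "Set some variables for fstab checking" record builds match lists from the fstab content against expected counts, and each assertion compares a list length to its expected value. A minimal sketch of the identifier check, reusing the fact names from that record (the task name is assumed for illustration):

    - name: Verify that the device identifier appears in /etc/fstab (sketch)
      ansible.builtin.assert:
        that:
          # one fstab line referencing the device, as precomputed above
          - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
        msg: "/dev/mapper/foo-test1 should appear exactly once in /etc/fstab"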
"changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 02:06:25 -0500 (0:00:00.565) 0:21:46.901 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 02:06:26 -0500 (0:00:00.505) 0:21:47.406 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 02:06:26 -0500 (0:00:00.549) 0:21:47.956 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 02:06:26 -0500 (0:00:00.199) 0:21:48.156 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 02:06:27 -0500 (0:00:00.877) 0:21:49.033 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 02:06:28 -0500 (0:00:00.798) 0:21:49.831 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719916.3306544, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768719916.3306544, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2064, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768719916.3306544, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 02:06:29 -0500 (0:00:01.222) 0:21:51.054 ******** ok: [managed-node9] => { 
"changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 02:06:30 -0500 (0:00:00.379) 0:21:51.433 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 02:06:30 -0500 (0:00:00.311) 0:21:51.745 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 02:06:30 -0500 (0:00:00.314) 0:21:52.060 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 02:06:31 -0500 (0:00:00.268) 0:21:52.328 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 02:06:31 -0500 (0:00:00.194) 0:21:52.523 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 02:06:31 -0500 (0:00:00.373) 0:21:52.897 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 02:06:31 -0500 (0:00:00.279) 0:21:53.176 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 02:06:34 -0500 (0:00:02.187) 0:21:55.364 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 02:06:34 
-0500 (0:00:00.243) 0:21:55.608 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 02:06:34 -0500 (0:00:00.209) 0:21:55.817 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 02:06:35 -0500 (0:00:00.764) 0:21:56.582 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 02:06:35 -0500 (0:00:00.207) 0:21:56.789 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 02:06:35 -0500 (0:00:00.301) 0:21:57.090 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 02:06:36 -0500 (0:00:00.264) 0:21:57.355 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 02:06:36 -0500 (0:00:00.256) 0:21:57.612 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 02:06:36 -0500 (0:00:00.261) 0:21:57.873 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
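The facts just set expect zero /etc/crypttab entries and a "-" key file, because the volume is unencrypted at this point in the test. A minimal sketch of the kind of count assertion behind "Check for /etc/crypttab entry" (illustrative only; test-verify-volume-encryption.yml:96 is the authoritative version):

  - name: Check for /etc/crypttab entry (sketch)
    ansible.builtin.assert:
      that:
        - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
      # expected count is "0" here: no LUKS mapping, hence no crypttab line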
TASK [Check for /etc/crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 02:06:37 -0500 (0:00:00.837) 0:21:58.710 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 02:06:38 -0500 (0:00:00.690) 0:21:59.401 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 02:06:38 -0500 (0:00:00.493) 0:21:59.895 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 02:06:39 -0500 (0:00:00.525) 0:22:00.420 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 02:06:39 -0500 (0:00:00.545) 0:22:00.966 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 02:06:40 -0500 (0:00:00.306) 0:22:01.272 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 02:06:40 -0500 (0:00:00.233) 0:22:01.506 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 02:06:40 -0500 (0:00:00.271) 0:22:01.777 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 02:06:40 -0500 (0:00:00.292) 0:22:02.070 ******** skipping: [managed-node9] => { "changed": false, "false_condition":
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 02:06:41 -0500 (0:00:00.241) 0:22:02.312 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 02:06:41 -0500 (0:00:00.189) 0:22:02.502 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 02:06:41 -0500 (0:00:00.233) 0:22:02.735 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 02:06:41 -0500 (0:00:00.216) 0:22:02.952 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 02:06:41 -0500 (0:00:00.204) 0:22:03.156 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 02:06:42 -0500 (0:00:00.265) 0:22:03.421 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 02:06:42 -0500 (0:00:00.193) 0:22:03.615 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 02:06:43 -0500 (0:00:01.510) 0:22:05.125 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: 
TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 02:06:45 -0500 (0:00:01.818) 0:22:06.943 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 02:06:46 -0500 (0:00:00.754) 0:22:07.698 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 02:06:46 -0500 (0:00:00.194) 0:22:07.892 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 02:06:48 -0500 (0:00:01.621) 0:22:09.514 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 02:06:48 -0500 (0:00:00.602) 0:22:10.116 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 02:06:49 -0500 (0:00:00.668) 0:22:10.785 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 02:06:50 -0500 (0:00:00.622) 0:22:11.407 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 02:06:50 -0500 (0:00:00.700) 0:22:12.108 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 02:06:51 -0500 (0:00:00.539) 0:22:12.648 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 02:06:51 -0500 (0:00:00.503) 0:22:13.151 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 02:06:52 -0500 (0:00:00.536) 0:22:13.688 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 02:06:53 -0500 (0:00:00.608) 0:22:14.296 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 02:06:53 -0500 (0:00:00.649) 0:22:14.946 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 02:06:54 -0500 (0:00:00.701) 0:22:15.647 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 02:06:55 -0500 (0:00:00.709) 0:22:16.357 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 02:06:56 -0500 (0:00:00.899) 0:22:17.256 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 02:06:56 -0500 (0:00:00.642) 0:22:17.899 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 02:06:57 -0500 (0:00:00.830) 0:22:18.730 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional 
result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 02:06:58 -0500 (0:00:00.557) 0:22:19.287 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 02:06:58 -0500 (0:00:00.652) 0:22:19.940 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 02:06:59 -0500 (0:00:00.746) 0:22:20.687 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 02:07:00 -0500 (0:00:00.535) 0:22:21.222 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 02:07:00 -0500 (0:00:00.588) 0:22:21.810 ******** ok: [managed-node9] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 02:07:00 -0500 (0:00:00.217) 0:22:22.028 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 02:07:01 -0500 (0:00:00.284) 0:22:22.313 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 02:07:01 -0500 (0:00:00.678) 0:22:22.991 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.036221", "end": "2026-01-18 02:07:02.853556", "rc": 0, "start": "2026-01-18 02:07:02.817335" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= 
TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 02:07:03 -0500 (0:00:01.251) 0:22:24.242 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 02:07:03 -0500 (0:00:00.566) 0:22:24.809 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 02:07:04 -0500 (0:00:00.614) 0:22:25.423 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 02:07:04 -0500 (0:00:00.454) 0:22:25.878 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 02:07:05 -0500 (0:00:00.551) 0:22:26.430 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 02:07:05 -0500 (0:00:00.513) 0:22:26.944 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 02:07:06 -0500 (0:00:00.560) 0:22:27.504 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 02:07:06 -0500 (0:00:00.228) 0:22:27.733 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 02:07:07 -0500 (0:00:00.551) 0:22:28.285 ******** ok: [managed-node9] => { "ansible_facts": {
"storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Sunday 18 January 2026 02:07:07 -0500 (0:00:00.269) 0:22:28.554 ******** changed: [managed-node9] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:471 Sunday 18 January 2026 02:07:08 -0500 (0:00:01.204) 0:22:29.759 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node9 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Sunday 18 January 2026 02:07:09 -0500 (0:00:00.670) 0:22:30.430 ******** ok: [managed-node9] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Sunday 18 January 2026 02:07:09 -0500 (0:00:00.496) 0:22:30.927 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 02:07:10 -0500 (0:00:00.448) 0:22:31.375 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 02:07:10 -0500 (0:00:00.384) 0:22:31.760 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 02:07:11 -0500 (0:00:00.701) 0:22:32.461 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", 
"xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 02:07:12 -0500 (0:00:00.942) 0:22:33.404 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 02:07:12 -0500 (0:00:00.280) 0:22:33.685 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 02:07:12 -0500 (0:00:00.409) 0:22:34.094 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 02:07:13 -0500 (0:00:00.303) 0:22:34.398 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 02:07:13 -0500 (0:00:00.217) 0:22:34.615 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 02:07:14 -0500 (0:00:00.862) 0:22:35.477 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet 
stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 02:07:16 -0500 (0:00:02.227) 0:22:37.704 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:07:17 -0500 (0:00:00.775) 0:22:38.480 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:07:18 -0500 (0:00:00.799) 0:22:39.279 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:07:20 -0500 (0:00:02.777) 0:22:42.057 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:07:21 -0500 (0:00:00.603) 0:22:42.661 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:07:22 -0500 (0:00:00.554) 0:22:43.216 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:07:22 -0500 (0:00:00.673) 0:22:43.889 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:07:23 -0500 (0:00:00.605) 0:22:44.495 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2
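The Show storage_pools output above corresponds to a pool definition along these lines (reconstructed from the displayed values; layout is illustrative, not the test's literal source):

  storage_pools:
    - name: foo
      type: lvm
      disks:
        - sda
      volumes:
        - name: test1
          size: 4g
          mount_point: /opt/test1
          encryption: true
          encryption_password: yabbadabbadoo

Requesting encryption: true for a volume that currently holds an unencrypted filesystem with data is exactly the destructive change safe mode must reject; consistent with that request, "Get required packages" plans for cryptsetup and lvm2.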
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 02:07:25 -0500 (0:00:02.324) 0:22:46.819 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" },
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service": { "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": 
"user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:07:28 -0500 (0:00:03.217) 0:22:50.037 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:07:29 -0500 (0:00:00.983) 0:22:51.020 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend 
cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-624bac9e-643d-439d-b06b-2978f908f664", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-624bac9e-643d-439d-b06b-2978f908f664 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-624bac9e-643d-439d-b06b-2978f908f664 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-624bac9e-643d-439d-b06b-2978f908f664 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": 
"infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Sun 2026-01-18 02:02:28 EST", "StateChangeTimestampMonotonic": "11326987203", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": 
"generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 02:07:31 -0500 (0:00:01.631) 0:22:52.651 ******** fatal: [managed-node9]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Sunday 18 January 2026 02:07:34 -0500 (0:00:02.860) 0:22:55.512 ******** fatal: [managed-node9]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 
'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:07:34 -0500 (0:00:00.323) 0:22:55.836 ******** changed: [managed-node9] => (item=systemd-cryptsetup@luks\x2d624bac9e\x2d643d\x2d439d\x2db06b\x2d2978f908f664.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "name": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "FreezerState": "running", "GID": "[not set]", 
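
Note on the failure above: this is the expected safe-mode behaviour. Adding encryption to test1 would require destroying its existing XFS formatting, and the module refuses destructive changes while safe_mode is true (visible in the module_args dump). To actually apply such a change, the role's documented storage_safe_mode variable must be set to false. A sketch of such a play, reusing the pool layout and password shown later in this run (the play name and host pattern are assumptions):

    - name: Add encryption to the volume, allowing existing formatting to be destroyed
      hosts: managed-node9
      vars:
        storage_safe_mode: false  # assumption: defaults to true, which caused the failure above
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo
      roles:
        - fedora.linux_system_roles.storage
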
"GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454172672", "LimitMEMLOCKSoft": "454172672", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d624bac9e\\x2d643d\\x2d439d\\x2db06b\\x2d2978f908f664.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d624bac9e\\\\x2d643d\\\\x2d439d\\\\x2db06b\\\\x2d2978f908f664.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", 
"RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Sunday 18 January 2026 02:07:36 -0500 (0:00:01.722) 0:22:57.558 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Sunday 18 January 2026 02:07:36 -0500 (0:00:00.284) 0:22:57.842 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Sunday 18 January 2026 02:07:37 -0500 (0:00:00.392) 0:22:58.235 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Sunday 18 January 2026 02:07:37 -0500 (0:00:00.375) 0:22:58.610 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768720028.391179, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768720028.391179, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1768720028.391179, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, 
"roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2131003110", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Sunday 18 January 2026 02:07:38 -0500 (0:00:01.446) 0:23:00.057 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:494 Sunday 18 January 2026 02:07:39 -0500 (0:00:00.302) 0:23:00.360 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 02:07:40 -0500 (0:00:01.181) 0:23:01.541 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 02:07:40 -0500 (0:00:00.319) 0:23:01.861 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 02:07:41 -0500 (0:00:00.690) 0:23:02.551 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 02:07:42 -0500 (0:00:00.950) 0:23:03.502 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 02:07:42 -0500 (0:00:00.316) 0:23:03.819 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 02:07:42 -0500 (0:00:00.349) 0:23:04.168 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 02:07:43 -0500 (0:00:00.322) 0:23:04.491 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 02:07:43 -0500 (0:00:00.243) 0:23:04.734 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 02:07:44 -0500 (0:00:00.668) 0:23:05.403 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 02:07:46 -0500 (0:00:02.216) 0:23:07.619 ******** ok: [managed-node9] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:07:47 -0500 (0:00:00.681) 0:23:08.301 ******** ok: [managed-node9] => { "storage_volumes | d([])": [] } TASK 
[fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:07:47 -0500 (0:00:00.836) 0:23:09.137 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:07:50 -0500 (0:00:02.469) 0:23:11.607 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:07:50 -0500 (0:00:00.535) 0:23:12.143 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:07:51 -0500 (0:00:00.518) 0:23:12.661 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:07:52 -0500 (0:00:00.622) 0:23:13.284 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:07:52 -0500 (0:00:00.548) 0:23:13.832 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 02:07:54 -0500 (0:00:02.358) 0:23:16.190 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { 
"name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": 
"kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": 
"systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": 
"systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:07:58 -0500 (0:00:03.143) 0:23:19.333 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:07:59 -0500 (0:00:01.127) 0:23:20.461 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 02:07:59 -0500 (0:00:00.205) 0:23:20.666 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } 
] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 02:08:12 -0500 (0:00:12.810) 0:23:33.477 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 02:08:12 -0500 (0:00:00.565) 0:23:34.043 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719928.0787094, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6192220afe0ca030c8bd5d193d7664f061f0b243", "ctime": 1768719928.0757093, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768719928.0757093, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1458, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 02:08:14 -0500 (0:00:01.366) 0:23:35.409 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:08:15 -0500 (0:00:01.481) 0:23:36.890 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 02:08:15 -0500 (0:00:00.210) 0:23:37.101 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 02:08:16 -0500 (0:00:00.458) 0:23:37.560 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 02:08:16 -0500 (0:00:00.329) 0:23:37.889 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 02:08:16 -0500 (0:00:00.311) 0:23:38.201 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Sunday 18 January 2026 02:08:18 -0500 (0:00:01.840) 0:23:40.041 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Sunday 18 January 2026 02:08:20 -0500 (0:00:02.031) 0:23:42.072 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Sunday 18 January 2026 
02:08:22 -0500 (0:00:01.890) 0:23:43.962 ******** skipping: [managed-node9] => (item={'src': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Sunday 18 January 2026 02:08:23 -0500 (0:00:00.713) 0:23:44.676 ******** ok: [managed-node9] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Sunday 18 January 2026 02:08:25 -0500 (0:00:01.923) 0:23:46.601 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768719942.4917767, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1768719934.3777387, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 20971726, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1768719934.378321, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1205320921", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Sunday 18 January 2026 02:08:26 -0500 (0:00:01.306) 0:23:47.908 ******** changed: [managed-node9] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 02:08:28 -0500 (0:00:01.777) 0:23:49.686 ******** ok: [managed-node9] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:510 Sunday 18 January 2026 02:08:30 -0500 (0:00:02.046) 
0:23:51.732 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 02:08:31 -0500 (0:00:00.863) 0:23:52.595 ******** ok: [managed-node9] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 02:08:35 -0500 (0:00:03.782) 0:23:56.377 ******** skipping: [managed-node9] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Sunday 18 January 2026 02:08:35 -0500 (0:00:00.572) 0:23:56.950 ******** ok: [managed-node9] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f3c8f6a4-386b-4395-85b7-8951ce5c6959" }, "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "size": "4G", "type": "crypt", "uuid": "04049eb7-2091-4779-8754-773daf5298ad" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } }
TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Sunday 18 January 2026 02:08:36 -0500 (0:00:01.231) 0:23:58.181 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002997", "end": "2026-01-18 02:08:37.918985", "rc": 0, "start": "2026-01-18 02:08:37.915988" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Mon Jan 5 14:46:37 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Sunday 18 January 2026 02:08:38 -0500 (0:00:01.124) 0:23:59.306 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003045", "end": "2026-01-18 02:08:39.116830", "failed_when_result": false, "rc": 0, "start": "2026-01-18 02:08:39.113785" }
STDOUT:
luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 /dev/mapper/foo-test1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Sunday 18 January 2026 02:08:39 -0500 (0:00:01.163) 0:24:00.470 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node9 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None,
'_device': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Sunday 18 January 2026 02:08:40 -0500 (0:00:00.894) 0:24:01.364 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Sunday 18 January 2026 02:08:40 -0500 (0:00:00.289) 0:24:01.654 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.026410", "end": "2026-01-18 02:08:41.589518", "rc": 0, "start": "2026-01-18 02:08:41.563108" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Sunday 18 January 2026 02:08:41 -0500 (0:00:01.330) 0:24:02.984 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Sunday 18 January 2026 02:08:42 -0500 (0:00:00.448) 0:24:03.433 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node9 => (item=members) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node9 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Sunday 18 January 2026 02:08:42 -0500 (0:00:00.733) 0:24:04.167 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Sunday 18 January 2026 02:08:43 -0500 (0:00:00.596) 0:24:04.763 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Sunday 18 January 2026 02:08:44 -0500 (0:00:01.233) 0:24:05.997 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Sunday 18 January 2026 02:08:45 -0500 (0:00:00.616) 0:24:06.614 ******** ok: [managed-node9] 
=> { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Sunday 18 January 2026 02:08:46 -0500 (0:00:00.747) 0:24:07.361 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Sunday 18 January 2026 02:08:46 -0500 (0:00:00.838) 0:24:08.200 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Sunday 18 January 2026 02:08:47 -0500 (0:00:00.336) 0:24:08.536 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Sunday 18 January 2026 02:08:48 -0500 (0:00:00.731) 0:24:09.267 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Sunday 18 January 2026 02:08:48 -0500 (0:00:00.379) 0:24:09.647 ******** ok: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Sunday 18 January 2026 02:08:48 -0500 (0:00:00.519) 0:24:10.166 ******** ok: [managed-node9] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:125623): WARNING **: 02:08:50.039: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.1 1 Jul 2025 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.45.176 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.45.176 originally 10.31.45.176 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/bb59fe80f9' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.45.176 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Sunday 18 January 2026 02:08:50 -0500 (0:00:01.326) 0:24:11.493 ******** skipping: [managed-node9] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Sunday 18 January 2026 02:08:50 -0500 (0:00:00.708) 0:24:12.201 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node9 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Sunday 18 January 2026 02:08:51 -0500 (0:00:00.604) 0:24:12.806 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Sunday 18 January 2026 02:08:51 -0500 (0:00:00.207) 0:24:13.013 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Sunday 18 January 2026 02:08:52 -0500 (0:00:00.252) 0:24:13.266 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Sunday 18 January 2026 02:08:52 -0500 (0:00:00.237) 0:24:13.503 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Sunday 18 January 2026 02:08:52 -0500 (0:00:00.263) 0:24:13.767 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Sunday 18 January 2026 02:08:52 -0500 (0:00:00.162) 0:24:13.929 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Sunday 18 January 2026 02:08:52 -0500 (0:00:00.225) 0:24:14.155 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Sunday 18 January 2026 02:08:53 -0500 (0:00:00.283) 0:24:14.439 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Sunday 18 January 2026 02:08:53 -0500 (0:00:00.175) 0:24:14.615 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Sunday 18 January 2026 02:08:53 -0500 (0:00:00.273) 0:24:14.889 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Sunday 18 January 2026 02:08:53 -0500 (0:00:00.207) 0:24:15.112 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Sunday 18 January 2026 02:08:54 -0500 (0:00:00.218) 0:24:15.330 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node9 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Sunday 18 January 2026 02:08:54 -0500 (0:00:00.566) 0:24:15.897 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Sunday 18 January 2026 02:08:55 -0500 (0:00:00.539) 0:24:16.437 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Sunday 18 January 2026 02:08:55 -0500 (0:00:00.272) 0:24:16.710 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Sunday 18 January 2026 02:08:55 -0500 (0:00:00.394) 0:24:17.105 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Sunday 18 January 2026 02:08:56 -0500 (0:00:00.380) 0:24:17.485 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Sunday 18 January 2026 02:08:56 -0500 (0:00:00.280) 0:24:17.766 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Sunday 18 January 2026 02:08:56 -0500 (0:00:00.336) 
0:24:18.102 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Sunday 18 January 2026 02:08:57 -0500 (0:00:00.357) 0:24:18.459 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Sunday 18 January 2026 02:08:57 -0500 (0:00:00.318) 0:24:18.777 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node9 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Sunday 18 January 2026 02:08:58 -0500 (0:00:00.564) 0:24:19.342 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Sunday 18 January 2026 02:08:58 -0500 (0:00:00.413) 0:24:19.755 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Sunday 18 January 2026 02:08:58 -0500 (0:00:00.207) 0:24:19.963 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that 
volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Sunday 18 January 2026 02:08:59 -0500 (0:00:00.294) 0:24:20.258 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Sunday 18 January 2026 02:08:59 -0500 (0:00:00.259) 0:24:20.517 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Sunday 18 January 2026 02:08:59 -0500 (0:00:00.239) 0:24:20.757 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node9 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Sunday 18 January 2026 02:09:00 -0500 (0:00:00.683) 0:24:21.440 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Sunday 18 January 2026 02:09:00 -0500 (0:00:00.614) 0:24:22.055 ******** skipping: [managed-node9] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Sunday 18 January 2026 02:09:01 -0500 (0:00:00.275) 0:24:22.330 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node9 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Sunday 18 January 2026 02:09:01 -0500 (0:00:00.413) 0:24:22.743 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Sunday 18 January 2026 02:09:02 -0500 (0:00:00.869) 0:24:23.613 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Sunday 18 January 2026 02:09:03 -0500 (0:00:00.700) 0:24:24.313 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Sunday 18 January 2026 02:09:03 -0500 (0:00:00.615) 0:24:24.929 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Sunday 18 January 2026 02:09:04 -0500 (0:00:00.545) 0:24:25.474 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Sunday 18 January 2026 02:09:04 -0500 (0:00:00.504) 0:24:25.978 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Sunday 18 January 2026 02:09:05 -0500 (0:00:00.272) 0:24:26.251 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Sunday 18 January 2026 02:09:05 -0500 (0:00:00.231) 0:24:26.483 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node9 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Sunday 18 January 2026 02:09:05 -0500 (0:00:00.638) 0:24:27.121 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 
'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Sunday 18 January 2026 02:09:06 -0500 (0:00:00.508) 0:24:27.629 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Sunday 18 January 2026 02:09:06 -0500 (0:00:00.212) 0:24:27.841 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Sunday 18 January 2026 02:09:06 -0500 (0:00:00.268) 0:24:28.110 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Sunday 18 January 2026 02:09:07 -0500 (0:00:00.247) 0:24:28.357 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Sunday 18 January 2026 02:09:07 -0500 (0:00:00.263) 0:24:28.621 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Sunday 18 January 2026 02:09:07 -0500 (0:00:00.292) 0:24:28.913 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } 
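
The six "Check if VDO ..." tasks above all skip on the same condition: this test volume leaves both deduplication and compression at None. For reference, a volume spec that would exercise these checks might look like the sketch below, reusing this test's pool layout (the true values are hypothetical, not from this run):

    storage_pools:
      - name: foo
        disks: [sda]
        volumes:
          - name: test1
            size: 4g
            compression: true     # hypothetical; None in this run
            deduplication: true   # hypothetical; None in this run

With either field non-None, the checks stop skipping and compare the reported VDO deduplication/compression state against the requested values.
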
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Sunday 18 January 2026 02:09:07 -0500 (0:00:00.202) 0:24:29.116 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Sunday 18 January 2026 02:09:08 -0500 (0:00:00.265) 0:24:29.382 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node9
TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Sunday 18 January 2026 02:09:08 -0500 (0:00:00.826) 0:24:30.208 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Print script output] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Sunday 18 January 2026 02:09:09 -0500 (0:00:00.175) 0:24:30.383 ******** skipping: [managed-node9] => { "false_condition": "storage_test_pool.type == 'stratis'" }
TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Sunday 18 January 2026 02:09:09 -0500 (0:00:00.244) 0:24:30.628 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Sunday 18 January 2026 02:09:09 -0500 (0:00:00.185) 0:24:30.814 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Sunday 18 January 2026 02:09:09 -0500 (0:00:00.317) 0:24:31.131 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Sunday 18 January 2026 02:09:10 -0500 (0:00:00.250) 0:24:31.381 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Sunday 18 January 2026 02:10:07? 02:09:10 -0500 (0:00:00.261) 0:24:31.643
******** ok: [managed-node9] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Sunday 18 January 2026 02:09:10 -0500 (0:00:00.280) 0:24:31.923 ******** ok: [managed-node9] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Sunday 18 January 2026 02:09:10 -0500 (0:00:00.239) 0:24:32.163 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Sunday 18 January 2026 02:09:11 -0500 (0:00:00.434) 0:24:32.598 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Sunday 18 January 2026 02:09:12 -0500 (0:00:00.619) 0:24:33.218 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs) included: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size) included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Sunday 18 January 2026 02:09:13 -0500 (0:00:01.605) 0:24:34.823 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Sunday 18 January 2026 02:09:14 -0500 (0:00:00.438) 0:24:35.261 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Sunday 18 January 2026 02:09:14 -0500 (0:00:00.581) 0:24:35.843 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Sunday 18 January 2026 02:09:15 -0500 (0:00:00.612) 0:24:36.455 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Sunday 18 January 2026 02:09:15 -0500 (0:00:00.276) 0:24:36.732 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Sunday 18 January 2026 02:09:16 -0500 (0:00:00.545) 0:24:37.277 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK 
[Verify mount directory permissions] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Sunday 18 January 2026 02:09:16 -0500 (0:00:00.704) 0:24:37.984 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Sunday 18 January 2026 02:09:17 -0500 (0:00:00.205) 0:24:38.688 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Sunday 18 January 2026 02:09:17 -0500 (0:00:00.257) 0:24:38.894 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Sunday 18 January 2026 02:09:17 -0500 (0:00:00.257) 0:24:39.151 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Unset facts] *************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Sunday 18 January 2026 02:09:18 -0500 (0:00:00.205) 0:24:39.356 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Sunday 18 January 2026 02:09:18 -0500 (0:00:00.249) 0:24:39.606 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Sunday 18 January 2026 02:09:19 -0500 (0:00:01.034) 0:24:40.640 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
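
Taken together, the match lists above pin down the /etc/fstab line the role wrote for this volume. Reassembled from the id, mount-point, and mount-option matches (field spacing approximate), with the trailing dump/pass fields corresponding to the volume's mount_check and mount_passno values of 0:

    /dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 /opt/test1 xfs defaults 0 0

Each "storage_test_fstab_expected_*_matches": "1" fact asserts that exactly one fstab line matches the corresponding pattern.
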
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Sunday 18 January 2026 02:09:20 -0500 (0:00:00.593) 0:24:41.234 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Sunday 18 January 2026 02:09:20 -0500 (0:00:00.599) 0:24:41.833 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Sunday 18 January 2026 02:09:21 -0500 (0:00:00.515) 0:24:42.348 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Clean up variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Sunday 18 January 2026 02:09:21 -0500 (0:00:00.811) 0:24:43.160 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Sunday 18 January 2026 02:09:22 -0500 (0:00:00.285) 0:24:43.445 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Verify fs label] *********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Sunday 18 January 2026 02:09:22 -0500 (0:00:00.656) 0:24:44.102 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [See whether the device node is present] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Sunday 18 January 2026 02:09:23 -0500 (0:00:00.656) 0:24:44.759 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768720091.470475, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768720091.470475, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2064, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768720091.470475, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] **************************
task path:
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 02:09:24 -0500 (0:00:01.286) 0:24:46.045 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 02:09:25 -0500 (0:00:00.285) 0:24:46.331 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 02:09:25 -0500 (0:00:00.219) 0:24:46.550 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 02:09:25 -0500 (0:00:00.294) 0:24:46.845 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 02:09:25 -0500 (0:00:00.165) 0:24:47.011 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 02:09:25 -0500 (0:00:00.126) 0:24:47.137 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 02:09:26 -0500 (0:00:00.284) 0:24:47.422 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768720091.9534774, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768720091.9534774, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2140, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1768720091.9534774, "nlink": 1, "path": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 02:09:27 -0500 (0:00:01.141) 0:24:48.563 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 02:09:29 -0500 (0:00:02.299) 0:24:50.863 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006996", "end": "2026-01-18 02:09:30.525860", "rc": 0, "start": "2026-01-18 02:09:30.518864" }
STDOUT:
LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           f3c8f6a4-386b-4395-85b7-8951ce5c6959
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)
Data segments:
  0: crypt
     offset: 16777216 [bytes]
     length: (whole device)
     cipher: aes-xts-plain64
     sector: 512 [bytes]
Keyslots:
  0: luks2
     Key:        512 bits
     Priority:   normal
     Cipher:     aes-xts-plain64
     Cipher key: 512 bits
     PBKDF:      argon2id
     Time cost:  4
     Memory:     665346
     Threads:    2
     Salt:       91 05 13 62 8c 19 c3 f9 88 0e c3 84 c4 39 d8 7f
                 92 3d c4 ad a4 2e 69 dd 80 20 1c 5e 10 72 ff b3
     AF stripes: 4000
     AF hash:    sha256
     Area offset:32768 [bytes]
     Area length:258048 [bytes]
     Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
     Hash:       sha256
     Iterations: 133474
     Salt:       3d c8 4a f1 3f a7 2e f0 78 07 4f d9 2d f1 e4 fb
                 4b 80 c9 30 55 87 e6 e9 44 dd 82 47 34 b6 da 9f
     Digest:     9c e8 92 dc 04 5b 85 7d 1f ca 3c 3b 47 e2 3c 2e
                 13 79 3e ad 76 54 1d 1c 29 d5 7b ca 81 71 d5 4a
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 02:09:30 -0500 (0:00:01.004) 0:24:51.868 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 02:09:31 -0500 (0:00:00.786) 0:24:52.654 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 02:09:32 -0500 (0:00:00.618) 0:24:53.273 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 02:09:32 -0500 (0:00:00.389) 0:24:53.662 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 02:09:32 -0500 (0:00:00.301) 0:24:53.963 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not
storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 02:09:33 -0500 (0:00:00.293) 0:24:54.257 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 02:09:33 -0500 (0:00:00.298) 0:24:54.555 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 02:09:33 -0500 (0:00:00.337) 0:24:54.893 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 02:09:34 -0500 (0:00:00.898) 0:24:55.791 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 02:09:35 -0500 (0:00:00.782) 0:24:56.573 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 02:09:36 -0500 (0:00:00.690) 0:24:57.264 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 02:09:36 -0500 (0:00:00.679) 0:24:57.944 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 02:09:37 -0500 (0:00:00.759) 0:24:58.703 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
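
Two details of this encryption subset are easy to miss in the stream. First, the "Check LUKS version", "Check LUKS key size", and "Check LUKS cipher" tasks skipped only because the test volume leaves those options unset; a spec pinning them explicitly, with values copied from the luksDump output above, might look like this sketch (the option names are taken from the skip conditions; nothing here is from an actual run with them set):

    volumes:
      - name: test1
        size: 4g
        encryption: true
        encryption_luks_version: luks2       # dump: Version: 2
        encryption_key_size: 512             # dump: Key: 512 bits
        encryption_cipher: aes-xts-plain64   # dump: cipher: aes-xts-plain64

Second, the crypttab entry validated above, "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959 /dev/mapper/foo-test1 -", follows the standard three-field /etc/crypttab layout of mapped name, backing device, and key file, where "-" means no key file, matching the expected key file of "-" set in the test variables.
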
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 02:09:37 -0500 (0:00:00.278) 0:24:58.981 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 02:09:37 -0500 (0:00:00.211) 0:24:59.193 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 02:09:38 -0500 (0:00:00.187) 0:24:59.380 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set md version regex] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 02:09:38 -0500 (0:00:00.258) 0:24:59.639 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 02:09:38 -0500 (0:00:00.216) 0:24:59.856 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 02:09:38 -0500 (0:00:00.210) 0:25:00.066 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 02:09:39 -0500 (0:00:00.196) 0:25:00.263 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 02:09:39 -0500 (0:00:00.253) 0:25:00.516 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday
18 January 2026 02:09:39 -0500 (0:00:00.205) 0:25:00.722 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 02:09:39 -0500 (0:00:00.197) 0:25:00.920 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 02:09:39 -0500 (0:00:00.176) 0:25:01.096 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 02:09:41 -0500 (0:00:01.731) 0:25:02.827 ******** ok: [managed-node9] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 02:09:43 -0500 (0:00:01.826) 0:25:04.654 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 02:09:44 -0500 (0:00:00.774) 0:25:05.428 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 02:09:44 -0500 (0:00:00.245) 0:25:05.673 ******** ok: [managed-node9] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 02:09:46 -0500 (0:00:01.709) 0:25:07.383 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 02:09:46 -0500 (0:00:00.674) 0:25:08.058 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 02:09:47 -0500 (0:00:00.705) 
0:25:08.764 ******** skipping: [managed-node9] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 02:09:48 -0500 (0:00:00.546) 0:25:09.311 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 02:09:48 -0500 (0:00:00.587) 0:25:09.899 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 02:09:49 -0500 (0:00:00.646) 0:25:10.545 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 02:09:50 -0500 (0:00:00.679) 0:25:11.224 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 02:09:50 -0500 (0:00:00.635) 0:25:11.860 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 02:09:51 -0500 (0:00:00.683) 0:25:12.543 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 02:09:52 -0500 (0:00:00.732) 0:25:13.293 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 02:09:52 -0500 (0:00:00.485) 0:25:13.779 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK 
[Show max thin pool size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 02:09:53 -0500 (0:00:00.711) 0:25:14.529 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 02:09:54 -0500 (0:00:00.713) 0:25:15.243 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 02:09:54 -0500 (0:00:00.724) 0:25:15.968 ******** skipping: [managed-node9] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 02:09:55 -0500 (0:00:00.592) 0:25:16.561 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 02:09:56 -0500 (0:00:00.741) 0:25:17.303 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 02:09:56 -0500 (0:00:00.626) 0:25:17.929 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 02:09:57 -0500 (0:00:00.562) 0:25:18.492 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 02:09:57 -0500 (0:00:00.592) 0:25:19.084 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 02:09:58 -0500 (0:00:00.709) 0:25:19.793 ******** ok: [managed-node9] => { 
"storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 02:09:58 -0500 (0:00:00.254) 0:25:20.048 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 02:09:59 -0500 (0:00:00.311) 0:25:20.360 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 02:09:59 -0500 (0:00:00.669) 0:25:21.029 ******** ok: [managed-node9] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032832", "end": "2026-01-18 02:10:00.981620", "rc": 0, "start": "2026-01-18 02:10:00.948788" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 02:10:01 -0500 (0:00:01.326) 0:25:22.356 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 02:10:01 -0500 (0:00:00.661) 0:25:23.017 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 02:10:02 -0500 (0:00:00.644) 0:25:23.662 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 02:10:03 -0500 (0:00:00.632) 0:25:24.295 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 02:10:03 -0500 (0:00:00.657) 0:25:24.952 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional 
result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Sunday 18 January 2026 02:10:04 -0500 (0:00:00.581) 0:25:25.533 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Sunday 18 January 2026 02:10:04 -0500 (0:00:00.670) 0:25:26.204 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Sunday 18 January 2026 02:10:05 -0500 (0:00:00.244) 0:25:26.449 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Sunday 18 January 2026 02:10:05 -0500 (0:00:00.598) 0:25:27.048 ******** ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:513 Sunday 18 January 2026 02:10:05 -0500 (0:00:00.147) 0:25:27.196 ******** included: fedora.linux_system_roles.storage for managed-node9 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Sunday 18 January 2026 02:10:07 -0500 (0:00:01.234) 0:25:28.430 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Sunday 18 January 2026 02:10:07 -0500 (0:00:00.340) 0:25:28.771 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_required_facts | difference(ansible_facts.keys() | list) | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Sunday 18 January 2026 02:10:08 -0500 (0:00:00.578) 0:25:29.350 ******** skipping: [managed-node9] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node9] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node9] => (item=CentOS_9.yml) => { 
"ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node9] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Sunday 18 January 2026 02:10:08 -0500 (0:00:00.821) 0:25:30.171 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Sunday 18 January 2026 02:10:09 -0500 (0:00:00.358) 0:25:30.530 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Sunday 18 January 2026 02:10:11 -0500 (0:00:01.782) 0:25:32.312 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Sunday 18 January 2026 02:10:11 -0500 (0:00:00.310) 0:25:32.622 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Sunday 18 January 2026 02:10:11 -0500 (0:00:00.251) 0:25:32.874 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Sunday 18 January 2026 02:10:12 -0500 (0:00:00.739) 0:25:33.614 ******** ok: [managed-node9] => { "changed": false, 
"rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kmod-kvdo libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python3-blivet stratis-cli stratisd vdo xfsprogs TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Sunday 18 January 2026 02:10:14 -0500 (0:00:02.307) 0:25:35.922 ******** ok: [managed-node9] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Sunday 18 January 2026 02:10:15 -0500 (0:00:00.715) 0:25:36.638 ******** ok: [managed-node9] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Sunday 18 January 2026 02:10:16 -0500 (0:00:00.697) 0:25:37.335 ******** ok: [managed-node9] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Sunday 18 January 2026 02:10:18 -0500 (0:00:02.822) 0:25:40.157 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node9 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Sunday 18 January 2026 02:10:19 -0500 (0:00:00.669) 0:25:40.826 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Sunday 18 January 2026 02:10:20 -0500 (0:00:00.560) 0:25:41.387 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Sunday 18 January 2026 02:10:20 -0500 (0:00:00.614) 0:25:42.001 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Sunday 18 January 2026 02:10:21 -0500 (0:00:00.559) 0:25:42.560 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Sunday 18 January 2026 
02:10:23 -0500 (0:00:02.335) 0:25:44.896 ******** ok: [managed-node9] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", 
"status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Sunday 18 January 2026 02:10:26 -0500 (0:00:02.975) 0:25:47.871 ******** ok: [managed-node9] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Sunday 18 January 2026 02:10:27 -0500 (0:00:00.769) 0:25:48.652 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Sunday 18 January 2026 02:10:27 -0500 (0:00:00.212) 0:25:48.864 ******** changed: [managed-node9] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", 
"thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Sunday 18 January 2026 02:10:31 -0500 (0:00:03.814) 0:25:52.679 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Sunday 18 January 2026 02:10:31 -0500 (0:00:00.511) 0:25:53.190 ******** ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768720102.541527, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8c9c1bb1273d9c724e36b9ff7476b40ed37d12a9", "ctime": 1768720102.538527, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 574619848, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1768720102.538527, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "1014542963", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Sunday 18 January 2026 02:10:33 -0500 (0:00:01.260) 0:25:54.451 ******** ok: [managed-node9] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Sunday 18 January 2026 02:10:34 -0500 (0:00:01.257) 0:25:55.709 ******** skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Sunday 18 January 2026 02:10:34 -0500 (0:00:00.151) 0:25:55.860 ******** ok: [managed-node9] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", 
"/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Sunday 18 January 2026 02:10:34 -0500 (0:00:00.339) 0:25:56.199 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Sunday 18 January 2026 02:10:35 -0500 (0:00:00.291) 0:25:56.491 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Sunday 18 January 2026 02:10:35 -0500 (0:00:00.341) 0:25:56.833 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node9] => (item={'src': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', 'path': '/opt/test1', 'state': 'absent', 'fstype': 
TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] **************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150
Sunday 18 January 2026 02:10:35 -0500 (0:00:00.341) 0:25:56.833 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
changed: [managed-node9] => (item={'src': '/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161
Sunday 18 January 2026 02:10:37 -0500 (0:00:01.736) 0:25:58.569 ********
ok: [managed-node9] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Set up new/current mounts] ***********
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
Sunday 18 January 2026 02:10:39 -0500 (0:00:01.946) 0:26:00.516 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177
Sunday 18 January 2026 02:10:39 -0500 (0:00:00.596) 0:26:01.113 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189
Sunday 18 January 2026 02:10:40 -0500 (0:00:00.632) 0:26:01.745 ********
ok: [managed-node9] => { "changed": false, "name": null, "status": {} }

TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197
Sunday 18 January 2026 02:10:42 -0500 (0:00:01.891) 0:26:03.637 ********
ok: [managed-node9] => { "changed": false, "stat": { "atime": 1768720119.1156046, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6485cdac44f19fc64b490e0658be6a73f44e0b0f", "ctime": 1768720108.332554, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 314573065, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1768720108.3334842, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3083875649", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
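The next task rewrites /etc/crypttab to drop the stale entry. The role uses its own crypttab handling, but as an approximation only, the same effect could be sketched with lineinfile, using the mapper name from the logged item:

    - name: Drop the crypttab entry for the destroyed LUKS device
      ansible.builtin.lineinfile:
        path: /etc/crypttab
        # Match the line by its mapper name; state: absent deletes every match.
        regexp: '^luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959\s'
        state: absent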
"backing_device": "/dev/mapper/foo-test1", "name": "luks-f3c8f6a4-386b-4395-85b7-8951ce5c6959", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Sunday 18 January 2026 02:10:45 -0500 (0:00:01.580) 0:26:06.549 ******** ok: [managed-node9] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:523 Sunday 18 January 2026 02:10:47 -0500 (0:00:02.033) 0:26:08.582 ******** included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node9 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Sunday 18 January 2026 02:10:48 -0500 (0:00:00.909) 0:26:09.492 ******** skipping: [managed-node9] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Sunday 18 January 2026 02:10:48 -0500 (0:00:00.617) 0:26:10.109 ******** ok: [managed-node9] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Sunday 18 January 2026 02:10:49 -0500 (0:00:00.721) 0:26:10.831 ********
ok: [managed-node9] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "a072e5ad-b1ec-4900-abeb-5d4679ecfcd5" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Sunday 18 January 2026 02:10:50 -0500 (0:00:01.258) 0:26:12.089 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003030", "end": "2026-01-18 02:10:51.858775", "rc": 0, "start": "2026-01-18 02:10:51.855745" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Mon Jan 5 14:46:37 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=a072e5ad-b1ec-4900-abeb-5d4679ecfcd5 / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0

TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Sunday 18 January 2026 02:10:52 -0500 (0:00:01.121) 0:26:13.224 ********
ok: [managed-node9] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002961", "end": "2026-01-18 02:10:53.038741", "failed_when_result": false, "rc": 0, "start": "2026-01-18 02:10:53.035780" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Sunday 18 January 2026 02:10:53 -0500 (0:00:01.188) 0:26:14.412 ********
skipping: [managed-node9] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Sunday 18 January 2026 02:10:53 -0500 (0:00:00.505) 0:26:14.917 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node9 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=nF8Jg7-ZUfV-jV9Y-MYu6-Z3xj-fKdT-Gu9qOl'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 18 January 2026 02:10:54 -0500 (0:00:00.765) 0:26:15.683
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Sunday 18 January 2026 02:10:54 -0500 (0:00:00.765) 0:26:15.683 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Sunday 18 January 2026 02:10:55 -0500 (0:00:00.684) 0:26:16.368 ********
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node9 => (item=mount)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node9 => (item=fstab)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node9 => (item=fs)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node9 => (item=device)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node9 => (item=encryption)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node9 => (item=md)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node9 => (item=size)
included: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node9 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Sunday 18 January 2026 02:10:56 -0500 (0:00:01.834) 0:26:18.202 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Sunday 18 January 2026 02:10:57 -0500 (0:00:00.316) 0:26:18.519 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Sunday 18 January 2026 02:10:57 -0500 (0:00:00.589) 0:26:19.145 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Sunday 18 January 2026 02:10:58 -0500 (0:00:00.592) 0:26:19.738 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Sunday 18 January 2026 02:10:58 -0500 (0:00:00.218) 0:26:19.957 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Sunday 18 January 2026 02:10:59 -0500 (0:00:00.285) 0:26:20.242 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Sunday 18 January 2026 02:10:59 -0500 (0:00:00.196) 0:26:20.438 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Sunday 18 January 2026 02:10:59 -0500 (0:00:00.244) 0:26:20.682 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Sunday 18 January 2026 02:10:59 -0500 (0:00:00.203) 0:26:20.886 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Sunday 18 January 2026 02:10:59 -0500 (0:00:00.228) 0:26:21.115 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Sunday 18 January 2026 02:11:00 -0500 (0:00:00.183) 0:26:21.298 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Sunday 18 January 2026 02:11:00 -0500 (0:00:00.220) 0:26:21.518 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Sunday 18 January 2026 02:11:01 -0500 (0:00:01.071) 0:26:22.590 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Sunday 18 January 2026 02:11:01 -0500 (0:00:00.486) 0:26:23.076 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Sunday 18 January 2026 02:11:02 -0500 (0:00:00.608) 0:26:23.685 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Sunday 18 January 2026 02:11:03 -0500 (0:00:00.527) 0:26:24.212 ********
ok: [managed-node9] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Sunday 18 January 2026 02:11:03 -0500 (0:00:00.655) 0:26:24.868 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
"", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1768720231.1681304, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 447, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1768720231.1681304, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Sunday 18 January 2026 02:11:06 -0500 (0:00:01.093) 0:26:27.366 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Sunday 18 January 2026 02:11:06 -0500 (0:00:00.308) 0:26:27.698 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Sunday 18 January 2026 02:11:06 -0500 (0:00:00.169) 0:26:27.868 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Sunday 18 January 2026 02:11:06 -0500 (0:00:00.191) 0:26:28.059 ******** ok: [managed-node9] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Sunday 18 January 2026 02:11:07 -0500 (0:00:00.293) 0:26:28.352 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Sunday 18 January 2026 02:11:07 -0500 (0:00:00.228) 0:26:28.590 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Sunday 18 January 2026 02:11:07 -0500 (0:00:00.221) 0:26:28.812 ******** skipping: [managed-node9] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Sunday 18 January 2026 02:11:07 -0500 (0:00:00.298) 0:26:29.111 ******** ok: [managed-node9] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Sunday 18 January 2026 02:11:10 -0500 (0:00:02.307) 0:26:31.418 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Sunday 18 January 2026 02:11:10 -0500 (0:00:00.319) 0:26:31.738 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Sunday 18 January 2026 02:11:10 -0500 (0:00:00.183) 0:26:31.922 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Sunday 18 January 2026 02:11:10 -0500 (0:00:00.237) 0:26:32.159 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Sunday 18 January 2026 02:11:11 -0500 (0:00:00.225) 0:26:32.385 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Sunday 18 January 2026 02:11:11 -0500 (0:00:00.272) 0:26:32.657 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Sunday 18 January 2026 02:11:11 -0500 (0:00:00.204) 0:26:32.862 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", 
"skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Sunday 18 January 2026 02:11:11 -0500 (0:00:00.204) 0:26:33.067 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Sunday 18 January 2026 02:11:12 -0500 (0:00:00.202) 0:26:33.270 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Sunday 18 January 2026 02:11:12 -0500 (0:00:00.612) 0:26:33.882 ******** ok: [managed-node9] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Sunday 18 January 2026 02:11:13 -0500 (0:00:00.601) 0:26:34.484 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Sunday 18 January 2026 02:11:13 -0500 (0:00:00.611) 0:26:35.096 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Sunday 18 January 2026 02:11:14 -0500 (0:00:00.612) 0:26:35.708 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Sunday 18 January 2026 02:11:15 -0500 (0:00:00.645) 0:26:36.353 ******** ok: [managed-node9] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Sunday 18 January 2026 02:11:15 -0500 (0:00:00.284) 0:26:36.637 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", 
"skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Sunday 18 January 2026 02:11:15 -0500 (0:00:00.248) 0:26:36.886 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Sunday 18 January 2026 02:11:15 -0500 (0:00:00.193) 0:26:37.080 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Sunday 18 January 2026 02:11:16 -0500 (0:00:00.224) 0:26:37.304 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Sunday 18 January 2026 02:11:16 -0500 (0:00:00.201) 0:26:37.506 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Sunday 18 January 2026 02:11:16 -0500 (0:00:00.242) 0:26:37.749 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Sunday 18 January 2026 02:11:16 -0500 (0:00:00.258) 0:26:38.007 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Sunday 18 January 2026 02:11:16 -0500 (0:00:00.181) 0:26:38.189 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Sunday 18 January 2026 02:11:17 -0500 (0:00:00.217) 0:26:38.407 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Sunday 18 January 2026 02:11:17 -0500 (0:00:00.215) 0:26:38.623 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Sunday 18 January 2026 02:11:17 -0500 (0:00:00.182) 0:26:38.806 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Sunday 18 January 2026 02:11:18 -0500 (0:00:00.649) 0:26:39.455 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Sunday 18 January 2026 02:11:18 -0500 (0:00:00.479) 0:26:39.934 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Sunday 18 January 2026 02:11:19 -0500 (0:00:00.402) 0:26:40.337 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Sunday 18 January 2026 02:11:19 -0500 (0:00:00.236) 0:26:40.573 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Sunday 18 January 2026 02:11:19 -0500 (0:00:00.561) 0:26:41.135 ******** skipping: [managed-node9] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Sunday 18 January 2026 02:11:20 -0500 (0:00:00.670) 0:26:41.806 ******** skipping: [managed-node9] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Sunday 18 January 2026 02:11:21 -0500 (0:00:00.449) 0:26:42.255 ******** skipping: [managed-node9] => { "false_condition": 
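A quick sanity check on the fact above: storage_test_expected_size is tracked in bytes, and 4294967296 = 4 × 1024³, i.e. exactly 4 GiB, which lines up with the "4g" volume size requested elsewhere in this run (see the module_args in the error records below) rather than with the 10 GiB test disk itself.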
"_storage_test_volume_present | bool" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Sunday 18 January 2026 02:11:21 -0500 (0:00:00.484) 0:26:42.740 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Sunday 18 January 2026 02:11:22 -0500 (0:00:00.522) 0:26:43.263 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Sunday 18 January 2026 02:11:22 -0500 (0:00:00.228) 0:26:43.492 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Sunday 18 January 2026 02:11:22 -0500 (0:00:00.333) 0:26:43.825 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Sunday 18 January 2026 02:11:22 -0500 (0:00:00.228) 0:26:44.053 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Sunday 18 January 2026 02:11:23 -0500 (0:00:00.339) 0:26:44.392 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Sunday 18 January 2026 02:11:23 -0500 (0:00:00.306) 0:26:44.699 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Sunday 18 January 2026 02:11:23 -0500 (0:00:00.300) 0:26:45.000 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] 
************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Sunday 18 January 2026 02:11:24 -0500 (0:00:00.360) 0:26:45.360 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Sunday 18 January 2026 02:11:24 -0500 (0:00:00.319) 0:26:45.680 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Sunday 18 January 2026 02:11:24 -0500 (0:00:00.294) 0:26:45.975 ******** skipping: [managed-node9] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Sunday 18 January 2026 02:11:25 -0500 (0:00:00.292) 0:26:46.267 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Sunday 18 January 2026 02:11:25 -0500 (0:00:00.352) 0:26:46.620 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Sunday 18 January 2026 02:11:25 -0500 (0:00:00.259) 0:26:46.879 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Sunday 18 January 2026 02:11:25 -0500 (0:00:00.227) 0:26:47.107 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Sunday 18 January 2026 02:11:26 -0500 (0:00:00.280) 0:26:47.387 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Sunday 18 January 2026 02:11:26 -0500 (0:00:00.280) 0:26:47.668 ******** ok: [managed-node9] => { 
"storage_test_actual_size": { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Sunday 18 January 2026 02:11:26 -0500 (0:00:00.377) 0:26:48.046 ******** ok: [managed-node9] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Sunday 18 January 2026 02:11:27 -0500 (0:00:00.259) 0:26:48.305 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Sunday 18 January 2026 02:11:27 -0500 (0:00:00.539) 0:26:48.845 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Sunday 18 January 2026 02:11:27 -0500 (0:00:00.235) 0:26:49.080 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Sunday 18 January 2026 02:11:28 -0500 (0:00:00.178) 0:26:49.259 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Sunday 18 January 2026 02:11:28 -0500 (0:00:00.199) 0:26:49.459 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Sunday 18 January 2026 02:11:28 -0500 (0:00:00.182) 0:26:49.641 ******** skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Sunday 18 January 2026 02:11:28 -0500 (0:00:00.141) 0:26:49.782 ******** 
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Sunday 18 January 2026 02:11:28 -0500 (0:00:00.150) 0:26:49.933 ********
skipping: [managed-node9] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Sunday 18 January 2026 02:11:28 -0500 (0:00:00.210) 0:26:50.143 ********
ok: [managed-node9] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Sunday 18 January 2026 02:11:29 -0500 (0:00:00.282) 0:26:50.425 ********
ok: [managed-node9] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

PLAY RECAP *********************************************************************
managed-node9 : ok=1245 changed=60 unreachable=0 failed=0 skipped=1073 rescued=18 ignored=0
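The error records that follow all stem from two intentional traps in this LUKS test: declaring encryption without supplying a key or password ("encrypted volume ... missing key/password"), and asking the role to add or remove encryption on an already-formatted device while safe mode is active ("cannot remove existing formatting ... in safe mode"). The recap's failed=0 alongside rescued=18 is consistent with these being deliberately provoked failures that the test rescues and verifies. A sketch of the two ingredients involved, assuming the role's documented safe-mode toggle and the storage_volumes keys visible in the module_args below (vault_luks_password is a hypothetical variable):

    - hosts: managed-node9
      vars:
        storage_safe_mode: false    # permit reformatting/encryption changes on existing devices
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks: [sda]
                mount_point: /opt/test1
                encryption: true
                encryption_password: "{{ vault_luks_password }}"  # required when encryption is true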
"raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:45:47.797707+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:47:57.749242+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb' in safe mode due to encryption removal", "start_time": "2026-01-18T06:47:55.307443+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:47:58.172589+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-15e4a2f9-7dc9-4007-a374-efc5cd11b0eb' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:47:57.801566+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:49:49.437843+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-01-18T06:49:47.195014+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:49:49.809613+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:49:49.510661+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:51:53.017628+00:00Z", "host": "managed-node9", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-01-18T06:51:50.456764+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:51:53.352716+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, 
"raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:51:53.086098+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:54:23.529781+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926' in safe mode due to encryption removal", "start_time": "2026-01-18T06:54:20.861195+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:54:23.898637+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, 
"encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-011d559f-bdd6-4d26-94dc-eebd371e9926' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:54:23.585479+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:56:47.563901+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-01-18T06:56:45.111365+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:56:47.997946+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, 
"grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:56:47.632372+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:59:29.091493+00:00Z", "host": "managed-node9", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-01-18T06:59:26.483286+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T06:59:29.512701+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T06:59:29.192275+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T07:04:44.636760+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'luks-624bac9e-643d-439d-b06b-2978f908f664' in safe mode due to encryption removal", "start_time": "2026-01-18T07:04:41.779979+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T07:04:45.030743+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, 
"diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-624bac9e-643d-439d-b06b-2978f908f664' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T07:04:44.733643+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T07:07:34.245719+00:00Z", "host": "managed-node9", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-01-18T07:07:31.443693+00:00Z", 
"task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-01-18T07:07:34.566977+00:00Z", "host": "managed-node9", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-01-18T07:07:34.304238+00:00Z", 
"task_name": "Failed message", "task_path": "/tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Sunday 18 January 2026 02:11:29 -0500 (0:00:00.180) 0:26:50.605 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.12s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.81s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.67s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.27s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.10s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.49s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 6.20s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 fedora.linux_system_roles.storage : Get service facts ------------------- 6.11s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Get service facts ------------------- 6.07s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Gathering Facts --------------------------------------------------------- 5.43s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 5.18s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 Write the key into the key file ----------------------------------------- 5.03s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:319 fedora.linux_system_roles.storage : Get service facts ------------------- 4.61s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:52 fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.06s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.96s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Create a key file ------------------------------------------------------- 3.89s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:312 
Find unused disks in the system ----------------------------------------- 3.82s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 3.81s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Print out pool information ---------------------------------------------- 3.78s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 fedora.linux_system_roles.storage : Set up new/current mounts ----------- 3.63s /tmp/collections-5Qw/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166
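
Editor's note: the failures recorded in the errors list above are the storage role's guard rails, exercised deliberately by tests_luks.yml. "encrypted volume 'test1' missing key/password" fires when a volume requests encryption: true but supplies neither encryption_key nor encryption_password, and the "cannot remove existing formatting ... in safe mode" variants fire because safe_mode (true by default) refuses any change that would destroy existing formatting, including adding or removing a LUKS layer. Below is a minimal sketch of an invocation that clears both guards; the pool/volume keys mirror the module_args captured in the records above (storage_pools, storage_safe_mode, encryption, encryption_password), while the host pattern and the vault_luks_password variable are illustrative placeholders, not values from this run.

    # Sketch only -- not part of the captured run. Variable names follow the
    # module_args logged above; "vault_luks_password" is a hypothetical
    # vaulted secret, and the disk name assumes sda is free for reuse.
    - hosts: managed-node9
      tasks:
        - name: Create an encrypted partition volume like the one under test
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            # safe_mode defaults to true; leaving it on reproduces the
            # "cannot remove existing formatting ... in safe mode" failures
            # whenever sda1 already carries a filesystem or LUKS header.
            storage_safe_mode: false
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    encryption: true
                    encryption_luks_version: luks2
                    # Omitting both encryption_key and encryption_password
                    # yields "encrypted volume 'test1' missing key/password".
                    encryption_password: "{{ vault_luks_password }}"

Disabling storage_safe_mode is what the test itself does between the recorded failure cases; outside a test it should be set to false only when reformatting the listed disks is genuinely intended, since it removes the role's only protection against destroying existing data.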