ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-h00
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.12 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks2.yml ******************************************************
1 plays in /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml

PLAY [Test LUKS2] **************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2
Wednesday 18 March 2026 20:17:42 -0400 (0:00:00.516) 0:00:00.516 *******
[WARNING]: Platform linux on host managed-node4 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node4] TASK [Enable FIPS mode] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:20 Wednesday 18 March 2026 20:17:49 -0400 (0:00:06.700) 0:00:07.217 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:28 Wednesday 18 March 2026 20:17:49 -0400 (0:00:00.148) 0:00:07.365 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Enable FIPS mode - 2] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:39 Wednesday 18 March 2026 20:17:49 -0400 (0:00:00.208) 0:00:07.574 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Reboot - 2] ************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:43 Wednesday 18 March 2026 20:17:49 -0400 (0:00:00.195) 0:00:07.769 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Ensure dracut-fips] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:53 Wednesday 18 March 2026 20:17:50 -0400 (0:00:00.196) 0:00:07.966 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Configure boot for FIPS] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:59 Wednesday 18 March 2026 20:17:50 -0400 (0:00:00.211) 0:00:08.177 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Reboot - 3] ************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:68 Wednesday 18 March 2026 20:17:50 -0400 (0:00:00.297) 0:00:08.474 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" } TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:72 Wednesday 18 March 2026 20:17:50 -0400 (0:00:00.193) 0:00:08.668 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] 
************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:17:51 -0400 (0:00:00.392) 0:00:09.061 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:17:51 -0400 (0:00:00.025) 0:00:09.087 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:17:51 -0400 (0:00:00.293) 0:00:09.380 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:17:52 -0400 (0:00:01.102) 0:00:10.483 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:17:52 -0400 (0:00:00.277) 0:00:10.760 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:17:55 -0400 (0:00:02.153) 0:00:12.914 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": 
false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:17:55 -0400 (0:00:00.621) 0:00:13.535 ******* ok: [managed-node4] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:17:58 -0400 (0:00:02.881) 0:00:16.417 ******* ok: [managed-node4] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:17:58 -0400 (0:00:00.212) 0:00:16.630 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:17:58 -0400 (0:00:00.168) 0:00:16.798 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:17:59 -0400 (0:00:00.176) 0:00:16.975 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:17:59 -0400 (0:00:00.716) 0:00:17.691 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:18:03 -0400 (0:00:04.118) 0:00:21.810 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:18:04 -0400 (0:00:00.220) 0:00:22.031 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:18:04 -0400 (0:00:00.257) 0:00:22.288 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:18:08 -0400 (0:00:04.080) 0:00:26.369 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:18:08 -0400 (0:00:00.429) 0:00:26.799 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:18:09 -0400 (0:00:00.204) 0:00:27.003 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:18:09 -0400 (0:00:00.176) 0:00:27.180 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:18:09 -0400 (0:00:00.193) 0:00:27.373 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:18:11 -0400 (0:00:01.946) 0:00:29.320 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": 
"dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", 
"state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": 
"systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:18:16 -0400 (0:00:05.126) 0:00:34.447 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:18:17 -0400 (0:00:00.494) 0:00:34.950 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:18:18 -0400 (0:00:01.625) 0:00:36.575 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:18:18 -0400 (0:00:00.232) 0:00:36.808 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879204.0216634, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "13fe4e8afb997f54092ca58fcbdc20404818d4db", "ctime": 1773879202.3926604, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773879202.3926604, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:18:20 -0400 (0:00:01.356) 0:00:38.165 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:18:20 -0400 (0:00:00.230) 0:00:38.395 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:18:20 -0400 (0:00:00.466) 0:00:38.862 ******* ok: [managed-node4] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:18:21 -0400 (0:00:00.231) 0:00:39.094 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:18:21 -0400 (0:00:00.296) 0:00:39.390 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task 
path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:18:21 -0400 (0:00:00.251) 0:00:39.641 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:18:21 -0400 (0:00:00.230) 0:00:39.872 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:18:22 -0400 (0:00:00.226) 0:00:40.099 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:18:22 -0400 (0:00:00.184) 0:00:40.284 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:18:22 -0400 (0:00:00.148) 0:00:40.433 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:18:22 -0400 (0:00:00.236) 0:00:40.669 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878307.7750347, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773129418.917, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773129111.523, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3423644330", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:18:23 -0400 (0:00:01.135) 0:00:41.805 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:18:24 -0400 (0:00:00.167) 0:00:41.972 ******* ok: [managed-node4] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:75 Wednesday 18 March 2026 20:18:26 -0400 (0:00:01.941) 0:00:43.914 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node4 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Wednesday 18 March 2026 20:18:26 -0400 (0:00:00.529) 0:00:44.443 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Wednesday 18 March 2026 20:18:28 -0400 (0:00:01.910) 0:00:46.354 ******* ok: [managed-node4] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG_SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Wednesday 18 March 2026 20:18:32 -0400 (0:00:03.562) 0:00:49.916 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Wednesday 18 March 2026 20:18:32 -0400 (0:00:00.190) 0:00:50.107 ******* ok: [managed-node4] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Wednesday 18 March 2026 20:18:32 -0400 (0:00:00.242) 0:00:50.350 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Wednesday 18 March 2026 20:18:32 -0400 (0:00:00.234) 0:00:50.584 ******* ok: [managed-node4] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:84 Wednesday 18 March 2026 20:18:32 -0400 (0:00:00.267) 0:00:50.852 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:18:33 -0400 (0:00:00.413) 0:00:51.266 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:18:33 -0400 (0:00:00.246) 0:00:51.512 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:18:34 -0400 (0:00:00.478) 0:00:51.991 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:18:34 -0400 (0:00:00.002) 0:00:51.993 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:18:34 -0400 (0:00:00.294) 0:00:52.288 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:18:34 -0400 (0:00:00.367) 0:00:52.655 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:18:35 -0400 (0:00:00.251) 0:00:52.906 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:18:36 -0400 (0:00:01.974) 0:00:54.881 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:18:37 -0400 (0:00:00.433) 0:00:55.315 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:18:37 -0400 (0:00:00.276) 0:00:55.591 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:18:37 -0400 (0:00:00.171) 0:00:55.763 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:18:38 -0400 (0:00:00.187) 0:00:55.950 ******* ok: 
[managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:18:38 -0400 (0:00:00.167) 0:00:56.118 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:18:38 -0400 (0:00:00.579) 0:00:56.698 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:18:40 -0400 (0:00:01.941) 0:00:58.639 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:18:41 -0400 (0:00:00.284) 0:00:58.923 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:18:41 -0400 (0:00:00.338) 0:00:59.261 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:18:43 -0400 (0:00:02.057) 0:01:01.319 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:18:43 -0400 (0:00:00.480) 0:01:01.799 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:18:44 -0400 (0:00:00.190) 0:01:01.990 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:18:44 -0400 (0:00:00.229) 0:01:02.219 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:18:44 -0400 (0:00:00.170) 0:01:02.389 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:18:46 -0400 (0:00:02.038) 0:01:04.428 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": 
"quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", 
"state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:18:49 -0400 (0:00:02.985) 0:01:07.413 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:18:49 -0400 (0:00:00.443) 0:01:07.856 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:18:52 -0400 (0:00:02.243) 0:01:10.100 ******* fatal: [managed-node4]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:18:52 -0400 (0:00:00.270) 0:01:10.371 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:18:52 -0400 (0:00:00.366) 0:01:10.738 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:18:53 -0400 (0:00:00.195) 0:01:10.933 
******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:18:53 -0400 (0:00:00.359) 0:01:11.293 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:100 Wednesday 18 March 2026 20:18:53 -0400 (0:00:00.215) 0:01:11.508 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:18:54 -0400 (0:00:00.386) 0:01:11.894 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:18:54 -0400 (0:00:00.019) 0:01:11.914 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:18:54 -0400 (0:00:00.198) 0:01:12.113 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:18:54 -0400 (0:00:00.261) 0:01:12.375 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:18:54 -0400 (0:00:00.158) 0:01:12.534 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:18:56 -0400 (0:00:01.903) 0:01:14.437 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", 
"libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:18:57 -0400 (0:00:00.794) 0:01:15.231 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:18:57 -0400 (0:00:00.327) 0:01:15.558 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:18:57 -0400 (0:00:00.319) 0:01:15.878 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:18:58 -0400 (0:00:00.195) 0:01:16.074 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:18:58 -0400 (0:00:00.301) 0:01:16.375 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:18:59 -0400 (0:00:00.936) 0:01:17.312 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] 
****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:19:01 -0400 (0:00:01.872) 0:01:19.185 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:19:01 -0400 (0:00:00.265) 0:01:19.451 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:19:01 -0400 (0:00:00.379) 0:01:19.830 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:19:04 -0400 (0:00:02.299) 0:01:22.130 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:19:04 -0400 (0:00:00.425) 0:01:22.555 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:19:04 -0400 (0:00:00.160) 0:01:22.716 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:19:05 -0400 (0:00:00.194) 0:01:22.910 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:19:05 -0400 (0:00:00.225) 0:01:23.136 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:19:07 -0400 (0:00:01.949) 0:01:25.085 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": 
"NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": 
"modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:19:10 -0400 (0:00:03.496) 0:01:28.582 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:19:12 -0400 (0:00:01.318) 0:01:29.901 ******* changed: [managed-node4] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:19:24 -0400 (0:00:12.412) 0:01:42.313 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:19:24 -0400 (0:00:00.154) 0:01:42.467 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879204.0216634, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "13fe4e8afb997f54092ca58fcbdc20404818d4db", "ctime": 1773879202.3926604, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773879202.3926604, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:19:25 -0400 (0:00:01.040) 0:01:43.508 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:19:28 -0400 (0:00:03.121) 0:01:46.629 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:19:29 -0400 (0:00:00.424) 0:01:47.054 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, 
"deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:19:29 -0400 (0:00:00.295) 0:01:47.350 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:19:29 -0400 (0:00:00.264) 0:01:47.614 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:19:29 -0400 (0:00:00.198) 0:01:47.813 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:19:30 -0400 (0:00:00.242) 0:01:48.056 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] 
*********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:19:35 -0400 (0:00:05.471) 0:01:53.527 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:19:38 -0400 (0:00:02.568) 0:01:56.096 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:19:38 -0400 (0:00:00.340) 0:01:56.436 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:19:40 -0400 (0:00:01.536) 0:01:57.972 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773878307.7750347, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773129418.917, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773129111.523, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3423644330", "wgrp": false, "woth": false, 
"writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:19:41 -0400 (0:00:00.919) 0:01:58.892 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda', 'name': 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:19:41 -0400 (0:00:00.895) 0:01:59.787 ******* ok: [managed-node4] TASK [Verify role results] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:112 Wednesday 18 March 2026 20:19:43 -0400 (0:00:01.651) 0:02:01.439 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:19:44 -0400 (0:00:00.461) 0:02:01.901 ******* skipping: [managed-node4] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:19:44 -0400 (0:00:00.133) 0:02:02.034 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:19:44 -0400 (0:00:00.162) 0:02:02.196 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "size": "10G", "type": "crypt", "uuid": "55937b04-c9aa-4420-982d-35d619f815ae" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "2ff7d768-0179-41ab-9b55-845a651e9a17" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:19:46 -0400 (0:00:02.386) 0:02:04.582 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003054", "end": "2026-03-18 20:19:49.205231", "rc": 0, "start": "2026-03-18 20:19:49.202177" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:19:49 -0400 (0:00:02.632) 0:02:07.215 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003054", "end": "2026-03-18 20:19:50.324230", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:19:50.321176" } STDOUT: luks-2ff7d768-0179-41ab-9b55-845a651e9a17 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:19:50 -0400 (0:00:01.123) 0:02:08.338 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:19:50 -0400 (0:00:00.204) 0:02:08.542 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:19:51 -0400 (0:00:00.494) 0:02:09.036 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:19:51 -0400 (0:00:00.307) 0:02:09.344 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:19:53 -0400 (0:00:02.166) 0:02:11.510 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:19:54 -0400 (0:00:00.434) 0:02:11.945 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:19:54 -0400 (0:00:00.340) 0:02:12.286 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": 
"Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:19:54 -0400 (0:00:00.347) 0:02:12.634 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:19:55 -0400 (0:00:00.260) 0:02:12.895 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:19:55 -0400 (0:00:00.422) 0:02:13.317 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:19:55 -0400 (0:00:00.428) 0:02:13.745 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:19:56 -0400 (0:00:00.323) 0:02:14.069 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:19:56 -0400 (0:00:00.171) 0:02:14.240 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:19:56 -0400 (0:00:00.229) 0:02:14.469 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:19:56 -0400 (0:00:00.195) 0:02:14.665 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK 
[Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:19:57 -0400 (0:00:00.255) 0:02:14.921 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:19:58 -0400 (0:00:00.976) 0:02:15.897 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:19:58 -0400 (0:00:00.350) 0:02:16.248 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:19:58 -0400 (0:00:00.394) 0:02:16.642 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:19:58 -0400 (0:00:00.183) 0:02:16.826 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:19:59 -0400 (0:00:00.330) 0:02:17.157 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:19:59 -0400 (0:00:00.223) 0:02:17.380 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:19:59 -0400 (0:00:00.364) 0:02:17.745 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed 
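(Editor's note, not part of the captured output: the volume facts echoed repeatedly above imply a role invocation roughly like the sketch below. This is a reconstruction from the logged `_storage_volumes_list` values only, not the actual contents of tests_luks2.yml, and the password is a placeholder since the real value is masked by no_log in the run.)

- name: Create an encrypted LUKS2 volume on sda (sketch reconstructed from logged facts)
  hosts: managed-node4
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo                       # volume name shown in the logged volume facts
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            size: 10737418240               # 10 GiB, matching the logged size
            encryption: true
            encryption_luks_version: luks2
            encryption_password: "<placeholder>"   # real value is VALUE_SPECIFIED_IN_NO_LOG_PARAMETER above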
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:20:00 -0400 (0:00:00.358) 0:02:18.104 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879563.8003614, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879563.8003614, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773879563.8003614, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:20:01 -0400 (0:00:01.227) 0:02:19.331 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:20:01 -0400 (0:00:00.298) 0:02:19.629 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:20:01 -0400 (0:00:00.252) 0:02:19.882 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:20:02 -0400 (0:00:00.311) 0:02:20.193 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:20:02 -0400 (0:00:00.294) 0:02:20.488 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:20:02 -0400 (0:00:00.289) 0:02:20.778 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:20:03 -0400 (0:00:00.378) 0:02:21.156 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879564.2573624, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879564.2573624, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1376, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773879564.2573624, "nlink": 1, "path": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:20:04 -0400 (0:00:01.189) 0:02:22.345 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:20:06 -0400 (0:00:01.883) 0:02:24.229 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007767", "end": "2026-03-18 20:20:07.315156", "rc": 0, "start": "2026-03-18 20:20:07.307389" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 2ff7d768-0179-41ab-9b55-845a651e9a17 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 675650 Threads: 2 Salt: d3 a8 b2 c6 64 39 90 af 82 0b 64 e3 bc c2 12 0b 1c d0 ed e7 c2 9b 93 de 26 6b c9 07 01 e3 b5 8f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131863 Salt: 09 38 fd 36 3e 6e a4 b7 a4 4d 9e 8f 87 bf e9 ef 61 73 79 c9 41 e9 b9 0b e1 c9 da 45 d5 65 35 4f Digest: 3f be d0 bc 7c 75 2d 91 52 54 a2 32 1f 01 27 ec b6 9b d4 f2 de 06 36 1e b6 09 3b 27 34 f0 12 f2 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:20:07 -0400 (0:00:01.154) 0:02:25.383 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:20:07 -0400 (0:00:00.279) 0:02:25.662 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make 
sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:20:08 -0400 (0:00:00.398) 0:02:26.061 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:20:08 -0400 (0:00:00.250) 0:02:26.312 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:20:08 -0400 (0:00:00.280) 0:02:26.592 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:20:09 -0400 (0:00:00.510) 0:02:27.124 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:20:09 -0400 (0:00:00.300) 0:02:27.425 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:20:09 -0400 (0:00:00.374) 0:02:27.799 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-2ff7d768-0179-41ab-9b55-845a651e9a17 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:20:10 -0400 (0:00:00.453) 0:02:28.253 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:20:10 -0400 (0:00:00.303) 0:02:28.557 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:20:11 -0400 (0:00:00.456) 0:02:29.014 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions 
passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:20:11 -0400 (0:00:00.327) 0:02:29.341 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:20:11 -0400 (0:00:00.375) 0:02:29.717 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:20:12 -0400 (0:00:00.217) 0:02:29.934 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:20:12 -0400 (0:00:00.183) 0:02:30.118 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:20:12 -0400 (0:00:00.261) 0:02:30.380 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:20:12 -0400 (0:00:00.244) 0:02:30.624 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:20:12 -0400 (0:00:00.166) 0:02:30.791 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:20:13 -0400 (0:00:00.210) 0:02:31.002 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:20:13 -0400 (0:00:00.289) 0:02:31.292 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:20:13 -0400 (0:00:00.236) 0:02:31.528 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:20:13 -0400 (0:00:00.182) 0:02:31.711 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:20:14 -0400 (0:00:00.227) 0:02:31.939 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:20:14 -0400 (0:00:00.209) 0:02:32.149 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:20:14 -0400 (0:00:00.242) 0:02:32.392 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:20:14 -0400 (0:00:00.182) 0:02:32.574 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:20:15 -0400 (0:00:00.312) 0:02:32.887 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 
18 March 2026 20:20:15 -0400 (0:00:00.274) 0:02:33.162 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:20:15 -0400 (0:00:00.230) 0:02:33.392 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:20:15 -0400 (0:00:00.219) 0:02:33.612 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:20:15 -0400 (0:00:00.265) 0:02:33.877 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:20:16 -0400 (0:00:00.194) 0:02:34.071 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:20:16 -0400 (0:00:00.210) 0:02:34.282 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:20:16 -0400 (0:00:00.250) 0:02:34.532 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:20:16 -0400 (0:00:00.288) 0:02:34.820 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:20:17 -0400 (0:00:00.293) 0:02:35.114 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:20:17 -0400 (0:00:00.299) 0:02:35.413 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:20:17 -0400 (0:00:00.338) 0:02:35.751 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:20:18 -0400 (0:00:00.287) 0:02:36.039 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:20:18 -0400 (0:00:00.245) 0:02:36.284 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:20:18 -0400 (0:00:00.273) 0:02:36.605 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:20:19 -0400 (0:00:00.308) 0:02:36.913 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:20:19 -0400 (0:00:00.297) 0:02:37.210 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:20:19 -0400 (0:00:00.425) 0:02:37.636 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:20:20 -0400 (0:00:00.307) 0:02:37.943 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:20:20 -0400 (0:00:00.208) 0:02:38.151 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:20:20 -0400 (0:00:00.237) 0:02:38.389 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:20:20 -0400 (0:00:00.275) 0:02:38.665 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:20:21 -0400 (0:00:00.253) 0:02:38.919 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:20:21 -0400 (0:00:00.346) 0:02:39.265 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:20:21 -0400 (0:00:00.234) 0:02:39.499 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:20:21 -0400 (0:00:00.205) 0:02:39.705 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:20:21 -0400 (0:00:00.171) 0:02:39.876 ******* skipping: [managed-node4] => { "changed": 
false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:20:22 -0400 (0:00:00.237) 0:02:40.114 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:20:22 -0400 (0:00:00.223) 0:02:40.338 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:20:22 -0400 (0:00:00.262) 0:02:40.601 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:20:23 -0400 (0:00:00.956) 0:02:41.557 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:20:23 -0400 (0:00:00.261) 0:02:41.819 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:20:24 -0400 (0:00:00.228) 0:02:42.047 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:20:24 -0400 (0:00:00.210) 0:02:42.257 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:118 Wednesday 18 March 2026 20:20:27 -0400 (0:00:03.557) 
0:02:45.815 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:20:28 -0400 (0:00:00.648) 0:02:46.464 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:20:28 -0400 (0:00:00.358) 0:02:46.822 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:20:29 -0400 (0:00:00.304) 0:02:47.126 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:20:29 -0400 (0:00:00.024) 0:02:47.151 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:20:29 -0400 (0:00:00.263) 0:02:47.415 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:20:29 -0400 (0:00:00.459) 0:02:47.874 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:20:30 -0400 (0:00:00.300) 0:02:48.175 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:20:32 -0400 (0:00:02.450) 0:02:50.626 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ 
"python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:20:33 -0400 (0:00:00.713) 0:02:51.340 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:20:33 -0400 (0:00:00.292) 0:02:51.632 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:20:34 -0400 (0:00:00.282) 0:02:51.915 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:20:34 -0400 (0:00:00.350) 0:02:52.266 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:20:34 -0400 (0:00:00.255) 0:02:52.522 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:20:35 -0400 (0:00:00.843) 0:02:53.365 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:20:37 -0400 (0:00:01.906) 0:02:55.271 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:20:37 -0400 (0:00:00.339) 0:02:55.611 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:20:38 -0400 (0:00:00.357) 0:02:55.968 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:20:40 -0400 (0:00:02.326) 0:02:58.294 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:20:40 -0400 (0:00:00.389) 0:02:58.684 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:20:40 -0400 (0:00:00.151) 0:02:58.835 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:20:41 -0400 (0:00:00.215) 0:02:59.051 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:20:41 -0400 (0:00:00.183) 0:02:59.234 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:20:43 -0400 (0:00:01.908) 0:03:01.143 ******* ok: [managed-node4] => { "ansible_facts": { "services": { 
"NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": 
"dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", 
"status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:20:46 -0400 (0:00:02.968) 0:03:04.154 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:20:46 -0400 (0:00:00.554) 0:03:04.709 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:20:49 -0400 (0:00:02.485) 0:03:07.195 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 
'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:20:49 -0400 (0:00:00.386) 0:03:07.581 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:20:50 -0400 (0:00:00.584) 0:03:08.165 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:20:50 -0400 (0:00:00.308) 0:03:08.474 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:20:51 -0400 (0:00:00.421) 0:03:08.896 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:20:51 -0400 (0:00:00.288) 0:03:09.184 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879627.7814884, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773879627.7814884, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773879627.7814884, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2365362547", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task 
path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:20:52 -0400 (0:00:01.347) 0:03:10.532 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:139 Wednesday 18 March 2026 20:20:53 -0400 (0:00:00.359) 0:03:10.892 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:20:53 -0400 (0:00:00.570) 0:03:11.462 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:20:53 -0400 (0:00:00.016) 0:03:11.478 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:20:53 -0400 (0:00:00.252) 0:03:11.731 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:20:54 -0400 (0:00:00.412) 0:03:12.144 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:20:54 -0400 (0:00:00.252) 0:03:12.396 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:20:56 -0400 (0:00:02.265) 0:03:14.681 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:20:57 -0400 (0:00:00.841) 0:03:15.522 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:20:57 -0400 (0:00:00.230) 0:03:15.753 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:20:58 -0400 (0:00:00.288) 0:03:16.042 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:20:58 -0400 (0:00:00.243) 0:03:16.285 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:20:58 -0400 (0:00:00.239) 0:03:16.525 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:20:59 -0400 (0:00:00.829) 0:03:17.354 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:21:01 -0400 (0:00:02.035) 0:03:19.389 ******* ok: [managed-node4] => { 
"storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:21:01 -0400 (0:00:00.223) 0:03:19.613 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:21:02 -0400 (0:00:00.271) 0:03:19.884 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:21:04 -0400 (0:00:02.251) 0:03:22.136 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:21:04 -0400 (0:00:00.480) 0:03:22.617 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:21:04 -0400 (0:00:00.204) 0:03:22.821 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:21:05 -0400 (0:00:00.282) 0:03:23.103 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:21:05 -0400 (0:00:00.142) 0:03:23.246 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:21:07 -0400 (0:00:01.908) 0:03:25.155 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": 
"multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": 
"rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": 
"systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:21:10 -0400 (0:00:02.908) 0:03:28.064 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:21:10 -0400 (0:00:00.597) 0:03:28.661 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:21:13 -0400 (0:00:02.753) 0:03:31.414 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab 
is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:21:13 -0400 (0:00:00.255) 0:03:31.670 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879578.0913887, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f400a69f761aa033ef53c6b56f97bc2ecae3f83b", "ctime": 1773879578.0883887, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773879578.0883887, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:21:15 -0400 (0:00:01.404) 0:03:33.074 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:21:16 -0400 (0:00:01.002) 0:03:34.077 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:21:16 -0400 (0:00:00.437) 0:03:34.514 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, 
"encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:21:16 -0400 (0:00:00.284) 0:03:34.799 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:21:17 -0400 (0:00:00.250) 0:03:35.049 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:21:17 -0400 (0:00:00.378) 0:03:35.427 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": 
"/dev/mapper/luks-2ff7d768-0179-41ab-9b55-845a651e9a17" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:21:18 -0400 (0:00:01.230) 0:03:36.658 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:21:20 -0400 (0:00:01.411) 0:03:38.069 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': 'UUID=113650d7-7735-43dd-953f-ae4a3ca1379e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:21:21 -0400 (0:00:01.635) 0:03:39.705 ******* skipping: [managed-node4] => (item={'src': 'UUID=113650d7-7735-43dd-953f-ae4a3ca1379e', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:21:22 -0400 (0:00:00.535) 0:03:40.241 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:21:23 -0400 (0:00:01.620) 0:03:41.861 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879590.3224132, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "773448e186ed61c62782e89f1ca9b9086fd77910", "ctime": 1773879581.773396, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 
671088843, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773879581.7744596, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2036644703", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:21:25 -0400 (0:00:01.337) 0:03:43.199 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda', 'name': 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:21:26 -0400 (0:00:01.465) 0:03:44.664 ******* ok: [managed-node4] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:152 Wednesday 18 March 2026 20:21:28 -0400 (0:00:01.893) 0:03:46.558 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:21:29 -0400 (0:00:00.595) 0:03:47.153 ******* skipping: [managed-node4] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:21:29 -0400 (0:00:00.225) 0:03:47.379 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": 
"present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:21:29 -0400 (0:00:00.314) 0:03:47.694 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "113650d7-7735-43dd-953f-ae4a3ca1379e" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:21:31 -0400 (0:00:01.362) 0:03:49.056 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002971", "end": "2026-03-18 20:21:32.331590", "rc": 0, "start": "2026-03-18 20:21:32.328619" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 UUID=113650d7-7735-43dd-953f-ae4a3ca1379e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:21:32 -0400 (0:00:01.302) 0:03:50.359 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002895", "end": "2026-03-18 20:21:33.547019", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:21:33.544124" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:21:33 -0400 (0:00:01.166) 0:03:51.525 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:21:33 -0400 (0:00:00.204) 0:03:51.730 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=113650d7-7735-43dd-953f-ae4a3ca1379e', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task 
path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:21:34 -0400 (0:00:00.573) 0:03:52.304 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:21:34 -0400 (0:00:00.273) 0:03:52.578 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:21:36 -0400 (0:00:02.261) 0:03:54.840 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:21:37 -0400 (0:00:00.430) 0:03:55.271 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:21:37 -0400 (0:00:00.403) 0:03:55.674 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:21:38 -0400 (0:00:00.421) 0:03:56.096 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:21:38 -0400 (0:00:00.345) 0:03:56.442 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:21:38 -0400 (0:00:00.415) 0:03:56.857 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:21:39 -0400 (0:00:00.334) 0:03:57.192 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:21:39 -0400 (0:00:00.300) 0:03:57.493 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:21:39 -0400 (0:00:00.207) 0:03:57.700 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:21:40 -0400 (0:00:00.185) 0:03:57.885 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:21:40 -0400 (0:00:00.300) 0:03:58.186 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:21:40 -0400 (0:00:00.212) 0:03:58.398 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:21:41 -0400 (0:00:00.636) 0:03:59.035 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:21:41 -0400 (0:00:00.325) 0:03:59.360 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:21:41 -0400 (0:00:00.297) 0:03:59.658 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:21:41 -0400 (0:00:00.155) 0:03:59.814 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:21:42 -0400 (0:00:00.321) 0:04:00.135 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:21:42 -0400 (0:00:00.199) 0:04:00.335 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:21:42 -0400 (0:00:00.496) 0:04:00.832 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:21:43 -0400 (0:00:00.311) 0:04:01.144 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879673.2575798, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879673.2575798, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773879673.2575798, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:21:44 -0400 (0:00:01.310) 0:04:02.454 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:21:44 -0400 (0:00:00.266) 0:04:02.721 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:21:45 -0400 (0:00:00.209) 0:04:02.930 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:21:45 -0400 (0:00:00.325) 0:04:03.256 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:21:45 -0400 (0:00:00.250) 0:04:03.506 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:21:45 -0400 (0:00:00.185) 0:04:03.692 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:21:46 -0400 
(0:00:00.958) 0:04:04.651 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:21:46 -0400 (0:00:00.209) 0:04:04.861 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:21:48 -0400 (0:00:01.886) 0:04:06.747 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.215) 0:04:06.962 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.258) 0:04:07.220 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.133) 0:04:07.354 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.177) 0:04:07.531 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.148) 0:04:07.680 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:21:49 -0400 (0:00:00.160) 0:04:07.841 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:21:50 -0400 (0:00:00.214) 0:04:08.055 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:21:50 -0400 (0:00:00.247) 0:04:08.302 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:21:50 -0400 (0:00:00.421) 0:04:08.724 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:21:51 -0400 (0:00:00.381) 0:04:09.105 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:21:51 -0400 (0:00:00.204) 0:04:09.309 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:21:51 -0400 (0:00:00.272) 0:04:09.582 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:21:51 -0400 (0:00:00.196) 0:04:09.778 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:21:52 -0400 (0:00:00.244) 0:04:10.023 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:21:52 -0400 (0:00:00.184) 0:04:10.208 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:21:52 -0400 (0:00:00.214) 0:04:10.422 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:21:52 -0400 (0:00:00.200) 0:04:10.622 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:21:52 -0400 (0:00:00.218) 0:04:10.841 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:21:53 -0400 (0:00:00.201) 0:04:11.042 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:21:53 -0400 (0:00:00.201) 0:04:11.243 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:21:53 -0400 (0:00:00.274) 0:04:11.517 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:21:53 -0400 (0:00:00.216) 0:04:11.733 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:21:54 -0400 (0:00:00.178) 0:04:11.912 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:21:54 -0400 (0:00:00.139) 0:04:12.052 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:21:54 -0400 (0:00:00.211) 0:04:12.264 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:21:54 -0400 (0:00:00.207) 0:04:12.471 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:21:54 -0400 (0:00:00.239) 0:04:12.711 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:21:55 -0400 (0:00:00.259) 0:04:12.971 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:21:55 -0400 (0:00:00.212) 0:04:13.184 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:21:55 -0400 (0:00:00.228) 0:04:13.412 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:21:55 -0400 (0:00:00.290) 0:04:13.702 
******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:21:56 -0400 (0:00:00.212) 0:04:13.914 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:21:56 -0400 (0:00:00.154) 0:04:14.068 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:21:56 -0400 (0:00:00.466) 0:04:14.535 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:21:56 -0400 (0:00:00.316) 0:04:14.852 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:21:57 -0400 (0:00:00.299) 0:04:15.151 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:21:57 -0400 (0:00:00.287) 0:04:15.439 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:21:57 -0400 (0:00:00.238) 0:04:15.677 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:21:58 -0400 (0:00:00.297) 0:04:15.974 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was 
False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:21:58 -0400 (0:00:00.304) 0:04:16.279 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:21:58 -0400 (0:00:00.270) 0:04:16.550 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:21:58 -0400 (0:00:00.238) 0:04:16.788 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:21:59 -0400 (0:00:00.321) 0:04:17.109 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:21:59 -0400 (0:00:00.262) 0:04:17.371 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:21:59 -0400 (0:00:00.223) 0:04:17.595 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:22:00 -0400 (0:00:00.346) 0:04:17.942 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:22:00 -0400 (0:00:00.203) 0:04:18.145 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:22:00 -0400 (0:00:00.287) 
0:04:18.433 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:22:00 -0400 (0:00:00.368) 0:04:18.802 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:22:01 -0400 (0:00:00.235) 0:04:19.037 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:22:01 -0400 (0:00:00.231) 0:04:19.269 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:22:01 -0400 (0:00:00.138) 0:04:19.407 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:22:01 -0400 (0:00:00.225) 0:04:19.633 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:22:01 -0400 (0:00:00.206) 0:04:19.839 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:22:02 -0400 (0:00:00.232) 0:04:20.072 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:22:02 -0400 (0:00:00.225) 0:04:20.298 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:22:02 -0400 (0:00:00.216) 0:04:20.515 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:22:02 -0400 (0:00:00.258) 0:04:20.773 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:22:03 -0400 (0:00:00.146) 0:04:20.920 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:22:03 -0400 (0:00:00.177) 0:04:21.097 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 2] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:158 Wednesday 18 March 2026 20:22:04 -0400 (0:00:01.175) 0:04:22.273 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:22:04 -0400 (0:00:00.567) 0:04:22.841 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:22:05 -0400 (0:00:00.238) 0:04:23.079 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:22:05 -0400 (0:00:00.251) 0:04:23.331 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:22:05 -0400 (0:00:00.002) 0:04:23.334 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:22:05 -0400 (0:00:00.246) 0:04:23.580 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:22:05 -0400 (0:00:00.289) 0:04:23.870 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:22:06 -0400 (0:00:00.238) 0:04:24.109 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:22:08 -0400 (0:00:01.912) 0:04:26.022 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : 
Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:22:08 -0400 (0:00:00.504) 0:04:26.526 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:22:08 -0400 (0:00:00.186) 0:04:26.712 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:22:09 -0400 (0:00:00.194) 0:04:26.907 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:22:09 -0400 (0:00:00.105) 0:04:27.013 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:22:09 -0400 (0:00:00.167) 0:04:27.181 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:22:09 -0400 (0:00:00.546) 0:04:27.728 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:22:11 -0400 (0:00:01.713) 0:04:29.442 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:22:11 -0400 (0:00:00.216) 0:04:29.658 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:22:12 -0400 (0:00:00.306) 0:04:29.965 
******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:22:14 -0400 (0:00:02.168) 0:04:32.133 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:22:14 -0400 (0:00:00.659) 0:04:32.793 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:22:15 -0400 (0:00:00.153) 0:04:32.946 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:22:15 -0400 (0:00:00.229) 0:04:33.175 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:22:15 -0400 (0:00:00.191) 0:04:33.367 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:22:17 -0400 (0:00:01.963) 0:04:35.331 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { 
"name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service": { "name": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:22:20 -0400 (0:00:02.760) 0:04:38.091 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d2ff7d768\x2d0179\x2d41ab\x2d9b55\x2d845a651e9a17.service) => { "ansible_loop_var": "item", 
"changed": true, "item": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "name": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" dev-sda.device systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d2ff7d768\\\\x2d0179\\\\x2d41ab\\\\x2d9b55\\\\x2d845a651e9a17.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2ff7d768-0179-41ab-9b55-845a651e9a17", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2ff7d768-0179-41ab-9b55-845a651e9a17 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2ff7d768-0179-41ab-9b55-845a651e9a17 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2ff7d768-0179-41ab-9b55-845a651e9a17 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2ff7d768-0179-41ab-9b55-845a651e9a17 ; flags= ; start_time=[n/a] 
; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2ff7d768\\\\x2d0179\\\\x2d41ab\\\\x2d9b55\\\\x2d845a651e9a17.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", 
"RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:21:23 EDT", "StateChangeTimestampMonotonic": "1741190155", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d2ff7d768\\\\x2d0179\\\\x2d41ab\\\\x2d9b55\\\\x2d845a651e9a17.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:22:22 -0400 (0:00:01.895) 0:04:39.987 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:22:24 -0400 (0:00:02.277) 0:04:42.265 ******* fatal: [managed-node4]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:22:24 -0400 (0:00:00.361) 0:04:42.626 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d2ff7d768\x2d0179\x2d41ab\x2d9b55\x2d845a651e9a17.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "name": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", 
"LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2ff7d768\\x2d0179\\x2d41ab\\x2d9b55\\x2d845a651e9a17.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2ff7d768\\\\x2d0179\\\\x2d41ab\\\\x2d9b55\\\\x2d845a651e9a17.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:22:26 -0400 (0:00:02.016) 0:04:44.643 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:22:27 -0400 (0:00:00.268) 0:04:44.911 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:22:27 -0400 (0:00:00.404) 0:04:45.316 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:22:27 -0400 (0:00:00.255) 0:04:45.572 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879724.2096825, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773879724.2096825, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773879724.2096825, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1609518128", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:22:29 -0400 (0:00:01.404) 0:04:46.976 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:179 Wednesday 18 March 2026 20:22:29 -0400 (0:00:00.326) 0:04:47.303 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:22:30 -0400 (0:00:00.718) 0:04:48.022 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:22:30 -0400 (0:00:00.014) 0:04:48.036 ******* skipping: [managed-node4] => { "changed": 
false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:22:30 -0400 (0:00:00.402) 0:04:48.439 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:22:30 -0400 (0:00:00.363) 0:04:48.803 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:22:31 -0400 (0:00:00.379) 0:04:49.183 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:22:33 -0400 (0:00:02.137) 0:04:51.320 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:22:34 -0400 (0:00:00.889) 0:04:52.210 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:22:34 -0400 (0:00:00.303) 0:04:52.513 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:22:34 -0400 (0:00:00.305) 0:04:52.819 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:22:35 -0400 (0:00:00.272) 0:04:53.091 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:22:35 -0400 (0:00:00.235) 0:04:53.327 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:22:36 -0400 (0:00:00.587) 0:04:53.914 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:22:38 -0400 (0:00:02.101) 0:04:56.016 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:22:38 -0400 (0:00:00.237) 0:04:56.253 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:22:38 -0400 (0:00:00.287) 0:04:56.541 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:22:40 -0400 (0:00:02.104) 0:04:58.646 ******* included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:22:41 -0400 (0:00:00.396) 0:04:59.043 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:22:41 -0400 (0:00:00.208) 0:04:59.252 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:22:41 -0400 (0:00:00.294) 0:04:59.546 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:22:41 -0400 (0:00:00.180) 0:04:59.727 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:22:43 -0400 (0:00:01.893) 0:05:01.620 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "running", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": 
"lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { 
"name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:22:46 -0400 (0:00:02.864) 0:05:04.485 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:22:47 -0400 (0:00:00.445) 0:05:04.931 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": 
"/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:22:58 -0400 (0:00:11.881) 0:05:16.812 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:22:59 -0400 (0:00:00.262) 0:05:17.074 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879681.6535966, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4b8a2ae7a3ab32b0405209e09d61e86df5d98913", "ctime": 1773879681.6495967, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773879681.6495967, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:23:00 -0400 (0:00:01.214) 0:05:18.289 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:23:01 -0400 (0:00:01.157) 0:05:19.446 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:23:01 -0400 (0:00:00.378) 0:05:19.825 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:23:02 -0400 (0:00:00.375) 0:05:20.200 ******* ok: [managed-node4] => { 
"ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:23:02 -0400 (0:00:00.312) 0:05:20.512 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:23:02 -0400 (0:00:00.337) 0:05:20.850 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': 'UUID=113650d7-7735-43dd-953f-ae4a3ca1379e', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=113650d7-7735-43dd-953f-ae4a3ca1379e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:23:06 -0400 (0:00:03.131) 0:05:23.981 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:23:07 -0400 (0:00:01.747) 0:05:25.729 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': 
None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:23:09 -0400 (0:00:01.551) 0:05:27.281 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:23:09 -0400 (0:00:00.422) 0:05:27.703 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:23:11 -0400 (0:00:01.611) 0:05:29.315 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879693.5456204, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773879686.6306067, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 226492615, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773879686.6312354, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3019504420", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:23:12 -0400 (0:00:01.219) 0:05:30.538 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda', 'name': 'luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": 
"", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:23:13 -0400 (0:00:01.327) 0:05:31.866 ******* ok: [managed-node4] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:192 Wednesday 18 March 2026 20:23:16 -0400 (0:00:02.029) 0:05:33.896 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:23:16 -0400 (0:00:00.758) 0:05:34.655 ******* skipping: [managed-node4] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:23:16 -0400 (0:00:00.194) 0:05:34.872 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:23:17 -0400 (0:00:00.307) 0:05:35.180 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "size": "10G", "type": "crypt", "uuid": "4a79b7c1-9ac4-442e-b6c1-1c38525e8a22" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:23:18 -0400 (0:00:01.090) 0:05:36.271 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003009", "end": "2026-03-18 20:23:19.362771", "rc": 0, "start": "2026-03-18 20:23:19.359762" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:23:19 -0400 (0:00:01.162) 0:05:37.433 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.004666", "end": "2026-03-18 20:23:21.527966", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:23:20.523300" } STDOUT: luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:23:21 -0400 (0:00:02.131) 0:05:39.565 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:23:21 -0400 (0:00:00.218) 0:05:39.784 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:23:22 -0400 (0:00:00.494) 0:05:40.279 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:23:22 -0400 (0:00:00.411) 0:05:40.691 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:23:24 -0400 (0:00:01.744) 0:05:42.435 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:23:24 -0400 (0:00:00.314) 0:05:42.749 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:23:25 -0400 (0:00:00.404) 0:05:43.154 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": 
"Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:23:25 -0400 (0:00:00.354) 0:05:43.508 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:23:25 -0400 (0:00:00.319) 0:05:43.828 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:23:26 -0400 (0:00:00.253) 0:05:44.081 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:23:26 -0400 (0:00:00.374) 0:05:44.455 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:23:26 -0400 (0:00:00.268) 0:05:44.724 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:23:27 -0400 (0:00:00.203) 0:05:44.927 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:23:27 -0400 (0:00:00.183) 0:05:45.110 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:23:27 -0400 (0:00:00.161) 0:05:45.272 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK 
[Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:23:27 -0400 (0:00:00.207) 0:05:45.480 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:23:28 -0400 (0:00:00.795) 0:05:46.275 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:23:28 -0400 (0:00:00.292) 0:05:46.567 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:23:28 -0400 (0:00:00.195) 0:05:46.763 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:23:29 -0400 (0:00:00.263) 0:05:47.026 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:23:29 -0400 (0:00:00.238) 0:05:47.265 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:23:29 -0400 (0:00:00.230) 0:05:47.495 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:23:30 -0400 (0:00:00.389) 0:05:47.884 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed 
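[Editor's note] The fstab checks above reduce to collecting regex matches against /etc/fstab (device identifier, mount point, mount options) and asserting on the match counts. Below is a minimal stand-alone sketch of the same idea; it is illustrative only, not the role's own test code. The task and variable names are assumptions made for this example, while the mapper device and mount point values are taken from the log output above.

```yaml
# Hypothetical sketch: verify that a LUKS-backed filesystem is wired into
# /etc/fstab, in the spirit of the verification tasks shown in this log.
- name: Illustrative fstab check for a LUKS-backed mount (sketch)
  hosts: managed-node4
  vars:
    # Values copied from the log above; substitute your own in practice.
    luks_mapper_device: /dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1
    expected_mount_point: /opt/test1
  tasks:
    - name: Read /etc/fstab
      ansible.builtin.slurp:
        src: /etc/fstab
      register: fstab_raw

    - name: Collect fstab lines that reference the mapper device and mount point
      ansible.builtin.set_fact:
        fstab_matches: >-
          {{ (fstab_raw.content | b64decode).splitlines()
             | select('search', luks_mapper_device ~ ' ' ~ expected_mount_point)
             | list }}

    - name: Assert exactly one matching entry exists
      ansible.builtin.assert:
        that:
          - fstab_matches | length == 1
```

The role's actual verification goes further than this sketch: as the subsequent tasks in the log show, it also checks the device node, the /etc/crypttab entry, and the LUKS2 header reported by cryptsetup luksDump.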
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:23:30 -0400 (0:00:00.354) 0:05:48.238 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879778.2637918, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879778.2637918, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773879778.2637918, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:23:31 -0400 (0:00:01.169) 0:05:49.408 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:23:31 -0400 (0:00:00.224) 0:05:49.633 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:23:31 -0400 (0:00:00.207) 0:05:49.841 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:23:32 -0400 (0:00:00.236) 0:05:50.077 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:23:32 -0400 (0:00:00.313) 0:05:50.391 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:23:32 -0400 (0:00:00.147) 0:05:50.539 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:23:32 -0400 (0:00:00.281) 0:05:50.821 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879778.7117927, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879778.7117927, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1504, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773879778.7117927, "nlink": 1, "path": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:23:34 -0400 (0:00:01.144) 0:05:51.966 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:23:35 -0400 (0:00:01.754) 0:05:53.720 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007214", "end": "2026-03-18 20:23:36.893216", "rc": 0, "start": "2026-03-18 20:23:36.886002" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 679944 Threads: 2 Salt: 72 b9 72 cd d2 c4 f6 79 ed cf c6 a0 b1 76 59 3b 01 69 69 40 ed 0f 6e d4 30 da 8d 8d b5 9d 59 1c AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131072 Salt: 57 d0 3e fa 0e 03 9b 6b 04 44 92 6e 1c 71 77 d1 4f d0 d4 fa 9a 7a bc d2 55 d1 36 4e ab 1e 79 75 Digest: 5d 29 29 e3 2b 8a 19 29 db 35 a7 15 35 72 6f 76 a4 82 52 e8 a5 72 47 b9 3f 20 68 48 44 b2 11 e2 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:23:37 -0400 (0:00:01.230) 0:05:54.951 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:23:37 -0400 (0:00:00.304) 0:05:55.256 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make 
sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:23:37 -0400 (0:00:00.362) 0:05:55.618 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:23:38 -0400 (0:00:00.267) 0:05:55.885 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:23:38 -0400 (0:00:00.271) 0:05:56.156 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:23:38 -0400 (0:00:00.432) 0:05:56.589 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:23:39 -0400 (0:00:00.322) 0:05:56.912 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:23:39 -0400 (0:00:00.293) 0:05:57.207 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:23:39 -0400 (0:00:00.357) 0:05:57.564 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:23:39 -0400 (0:00:00.288) 0:05:57.853 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:23:40 -0400 (0:00:00.347) 0:05:58.201 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions 
passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:23:40 -0400 (0:00:00.393) 0:05:58.594 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:23:41 -0400 (0:00:00.405) 0:05:59.000 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:23:41 -0400 (0:00:00.277) 0:05:59.277 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:23:41 -0400 (0:00:00.253) 0:05:59.530 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:23:41 -0400 (0:00:00.237) 0:05:59.768 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:23:42 -0400 (0:00:00.166) 0:05:59.934 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:23:42 -0400 (0:00:00.236) 0:06:00.171 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:23:42 -0400 (0:00:00.237) 0:06:00.409 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:23:42 -0400 (0:00:00.323) 0:06:00.732 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:23:43 -0400 (0:00:00.191) 0:06:00.923 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:23:43 -0400 (0:00:00.242) 0:06:01.165 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:23:43 -0400 (0:00:00.214) 0:06:01.380 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:23:43 -0400 (0:00:00.179) 0:06:01.560 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:23:43 -0400 (0:00:00.224) 0:06:01.785 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:23:44 -0400 (0:00:00.205) 0:06:01.990 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:23:44 -0400 (0:00:00.230) 0:06:02.221 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 
18 March 2026 20:23:44 -0400 (0:00:00.310) 0:06:02.531 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:23:44 -0400 (0:00:00.179) 0:06:02.711 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:23:45 -0400 (0:00:00.217) 0:06:02.929 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:23:45 -0400 (0:00:00.251) 0:06:03.180 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:23:45 -0400 (0:00:00.268) 0:06:03.448 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:23:45 -0400 (0:00:00.284) 0:06:03.733 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:23:46 -0400 (0:00:00.380) 0:06:04.113 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:23:46 -0400 (0:00:00.282) 0:06:04.396 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:23:46 -0400 (0:00:00.263) 0:06:04.659 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:23:47 -0400 (0:00:00.332) 0:06:04.992 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:23:47 -0400 (0:00:00.241) 0:06:05.248 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:23:47 -0400 (0:00:00.382) 0:06:05.630 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:23:48 -0400 (0:00:00.310) 0:06:05.940 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:23:48 -0400 (0:00:00.377) 0:06:06.318 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:23:48 -0400 (0:00:00.271) 0:06:06.589 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:23:49 -0400 (0:00:00.332) 0:06:06.922 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:23:49 -0400 (0:00:00.532) 0:06:07.454 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:23:49 -0400 (0:00:00.231) 0:06:07.686 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:23:50 -0400 (0:00:00.280) 0:06:07.967 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:23:50 -0400 (0:00:00.364) 0:06:08.331 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:23:50 -0400 (0:00:00.277) 0:06:08.609 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:23:50 -0400 (0:00:00.220) 0:06:08.829 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:23:51 -0400 (0:00:00.299) 0:06:09.129 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:23:51 -0400 (0:00:00.272) 0:06:09.401 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:23:51 -0400 (0:00:00.215) 0:06:09.616 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:23:51 -0400 (0:00:00.107) 0:06:09.724 ******* skipping: [managed-node4] => { "changed": 
false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:23:52 -0400 (0:00:00.185) 0:06:09.909 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:23:52 -0400 (0:00:00.161) 0:06:10.071 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:23:52 -0400 (0:00:00.319) 0:06:10.390 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:23:52 -0400 (0:00:00.264) 0:06:10.655 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:23:52 -0400 (0:00:00.198) 0:06:10.854 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:23:53 -0400 (0:00:00.319) 0:06:11.173 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:199 Wednesday 18 March 2026 20:23:53 -0400 (0:00:00.198) 0:06:11.372 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:23:54 -0400 (0:00:00.669) 0:06:12.074 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], 
"storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:23:54 -0400 (0:00:00.404) 0:06:12.478 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:23:55 -0400 (0:00:00.410) 0:06:12.889 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:23:55 -0400 (0:00:00.033) 0:06:12.922 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:23:55 -0400 (0:00:00.200) 0:06:13.122 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:23:56 -0400 (0:00:01.282) 0:06:14.405 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:23:56 -0400 (0:00:00.343) 0:06:14.749 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:23:59 -0400 (0:00:02.284) 0:06:17.034 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] 
=> (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:24:00 -0400 (0:00:00.848) 0:06:17.882 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:24:00 -0400 (0:00:00.274) 0:06:18.156 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:24:00 -0400 (0:00:00.262) 0:06:18.419 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:24:00 -0400 (0:00:00.215) 0:06:18.634 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:24:00 -0400 (0:00:00.173) 0:06:18.807 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:24:01 -0400 (0:00:00.884) 0:06:19.692 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:24:03 -0400 (0:00:02.157) 0:06:21.850 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", 
"size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:24:04 -0400 (0:00:00.365) 0:06:22.215 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:24:04 -0400 (0:00:00.373) 0:06:22.589 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:24:06 -0400 (0:00:02.175) 0:06:24.765 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:24:07 -0400 (0:00:00.350) 0:06:25.115 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:24:07 -0400 (0:00:00.231) 0:06:25.347 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:24:07 -0400 (0:00:00.237) 0:06:25.585 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:24:07 -0400 (0:00:00.174) 0:06:25.759 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:24:09 -0400 (0:00:01.762) 0:06:27.522 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": 
"apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": 
"dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { 
"name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": 
"systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the 
systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:24:13 -0400 (0:00:03.579) 0:06:31.101 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:24:13 -0400 (0:00:00.430) 0:06:31.532 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:24:16 -0400 (0:00:02.437) 0:06:33.970 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 
'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:24:16 -0400 (0:00:00.416) 0:06:34.386 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:24:16 -0400 (0:00:00.478) 0:06:34.865 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:24:17 -0400 (0:00:00.251) 0:06:35.116 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:24:17 -0400 (0:00:00.441) 0:06:35.557 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:219 Wednesday 18 March 2026 20:24:17 -0400 (0:00:00.282) 0:06:35.839 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:24:18 -0400 (0:00:00.748) 0:06:36.588 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:24:18 -0400 (0:00:00.028) 0:06:36.616 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 
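[Editor's note] The fatal result above ("encrypted volume 'test1' missing key/password") is the expected outcome of this negative test: the pool handed to the role declares a LUKS2-encrypted volume but supplies neither encryption_password nor encryption_key, as the logged module arguments show. The following is a hypothetical reconstruction of the relevant variables from those logged values, not the literal contents of tests_luks2.yml, which may phrase them differently:

    # Sketch reconstructed from the "Show storage_pools" output and the failed
    # module invocation above; names and layout are assumptions.
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_luks_version: luks2
            # No encryption_password or encryption_key here, which is what
            # produces "encrypted volume 'test1' missing key/password".

The next test iteration ("Create an encrypted partition volume w/ default fs") re-runs the role with the same pool plus encryption_password, which is why the second "Show storage_pools" dump further down includes the password and the run continues past this point.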
Wednesday 18 March 2026 20:24:19 -0400 (0:00:00.269) 0:06:36.886 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:24:19 -0400 (0:00:00.445) 0:06:37.331 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:24:19 -0400 (0:00:00.311) 0:06:37.642 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:24:21 -0400 (0:00:02.147) 0:06:39.793 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:24:22 -0400 (0:00:00.842) 0:06:40.636 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:24:23 -0400 (0:00:00.264) 0:06:40.901 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define 
an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:24:23 -0400 (0:00:00.269) 0:06:41.170 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:24:23 -0400 (0:00:00.219) 0:06:41.389 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:24:23 -0400 (0:00:00.314) 0:06:41.703 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:24:24 -0400 (0:00:00.585) 0:06:42.289 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:24:25 -0400 (0:00:01.559) 0:06:43.848 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:24:26 -0400 (0:00:00.195) 0:06:44.043 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:24:26 -0400 (0:00:00.224) 0:06:44.268 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:24:28 -0400 (0:00:02.136) 0:06:46.404 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:24:28 -0400 (0:00:00.357) 0:06:46.762 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:24:29 -0400 (0:00:00.174) 0:06:46.937 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:24:29 -0400 (0:00:00.243) 0:06:47.181 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:24:29 -0400 (0:00:00.157) 0:06:47.339 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:24:31 -0400 (0:00:01.804) 0:06:49.143 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": 
"cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" 
}, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { 
"name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:24:33 -0400 (0:00:02.657) 0:06:51.800 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:24:34 -0400 (0:00:00.357) 0:06:52.158 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": null }, { "action": "create 
format", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:24:46 -0400 (0:00:12.526) 0:07:04.685 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:24:46 -0400 (0:00:00.195) 0:07:04.881 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879789.236814, "attr_flags": "", "attributes": [], "block_size": 4096, 
"blocks": 8, "charset": "us-ascii", "checksum": "0214e2d632eeb23d7b712b669a3f2a5a2e701b77", "ctime": 1773879789.2328138, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773879789.2328138, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:24:48 -0400 (0:00:01.382) 0:07:06.264 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:24:49 -0400 (0:00:01.277) 0:07:07.542 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:24:50 -0400 (0:00:00.559) 0:07:08.101 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": 
null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:24:50 -0400 (0:00:00.508) 0:07:08.610 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", 
"thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:24:51 -0400 (0:00:00.329) 0:07:08.939 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:24:51 -0400 (0:00:00.270) 0:07:09.210 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:24:52 -0400 (0:00:01.614) 0:07:10.824 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:24:54 -0400 (0:00:01.701) 0:07:12.526 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:24:56 -0400 (0:00:01.495) 0:07:14.021 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { 
"ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:24:56 -0400 (0:00:00.406) 0:07:14.428 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:24:58 -0400 (0:00:01.687) 0:07:16.115 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879800.5258367, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7b7a6c264d9e2c5407ba668116dfa8ffc7943488", "ctime": 1773879793.802823, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 532676810, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773879793.8033178, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "3115580947", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:24:59 -0400 (0:00:01.112) 0:07:17.227 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda', 'name': 'luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node4] => (item={'backing_device': '/dev/sda1', 'name': 'luks-37471199-8d7b-4b98-b054-dea418ad7607', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:25:01 -0400 (0:00:02.222) 0:07:19.450 ******* ok: [managed-node4] TASK [Verify role results - 4] ************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:236 Wednesday 18 March 2026 20:25:03 -0400 (0:00:02.136) 0:07:21.587 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:25:04 -0400 (0:00:00.701) 0:07:22.289 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:25:04 -0400 (0:00:00.325) 0:07:22.615 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:25:04 -0400 (0:00:00.218) 0:07:22.833 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "size": "4G", "type": "crypt", "uuid": "4658a600-7be5-425d-9b67-4dc043f276da" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "37471199-8d7b-4b98-b054-dea418ad7607" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:25:06 -0400 (0:00:01.146) 0:07:23.980 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004042", "end": "2026-03-18 20:25:08.141520", "rc": 0, "start": "2026-03-18 20:25:07.137478" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:25:08 -0400 (0:00:02.218) 0:07:26.199 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003533", "end": "2026-03-18 20:25:09.349887", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:25:09.346354" } STDOUT: luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:25:09 -0400 (0:00:01.189) 0:07:27.389 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None,
'_device': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:25:10 -0400 (0:00:00.584) 0:07:27.973 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:25:10 -0400 (0:00:00.253) 0:07:28.227 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:25:10 -0400 (0:00:00.263) 0:07:28.491 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:25:10 -0400 (0:00:00.207) 0:07:28.698 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:25:11 -0400 (0:00:00.682) 0:07:29.381 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:25:11 -0400 (0:00:00.219) 0:07:29.600 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:25:11 -0400 (0:00:00.178) 0:07:29.779 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:25:12 -0400 (0:00:00.258) 0:07:30.038 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:25:12 -0400 (0:00:00.151) 0:07:30.189 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:25:12 -0400 (0:00:00.149) 0:07:30.339 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:25:12 -0400 (0:00:00.174) 0:07:30.514 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:25:12 -0400 (0:00:00.215) 0:07:30.730 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:25:13 -0400 (0:00:00.218) 0:07:30.949 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:25:13 -0400 (0:00:00.191) 0:07:31.140 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:79075): WARNING **: 20:25:14.201: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:25:14 -0400 (0:00:01.205) 0:07:32.346 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:25:14 -0400 (0:00:00.190) 0:07:32.536 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:25:15 -0400 (0:00:00.688) 0:07:33.225 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:25:15 -0400 (0:00:00.285) 0:07:33.511 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:25:15 -0400 (0:00:00.158) 0:07:33.670 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:25:15 -0400 (0:00:00.182) 0:07:33.853 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:25:16 -0400 (0:00:00.179) 0:07:34.033 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", 
"skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:25:16 -0400 (0:00:00.181) 0:07:34.214 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:25:16 -0400 (0:00:00.208) 0:07:34.423 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:25:16 -0400 (0:00:00.214) 0:07:34.638 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:25:17 -0400 (0:00:00.299) 0:07:34.937 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:25:17 -0400 (0:00:00.204) 0:07:35.142 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:25:17 -0400 (0:00:00.224) 0:07:35.366 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:25:17 -0400 (0:00:00.253) 0:07:35.619 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:25:18 -0400 (0:00:00.680) 0:07:36.300 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 
'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:25:18 -0400 (0:00:00.282) 0:07:36.582 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:25:19 -0400 (0:00:00.594) 0:07:37.176 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:25:19 -0400 (0:00:00.237) 0:07:37.414 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:25:20 -0400 (0:00:00.682) 0:07:38.097 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:25:20 -0400 (0:00:00.289) 0:07:38.387 ******* skipping: 
[managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:25:20 -0400 (0:00:00.237) 0:07:38.624 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:25:21 -0400 (0:00:01.117) 0:07:39.741 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:25:22 -0400 (0:00:00.239) 0:07:39.981 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:25:22 -0400 (0:00:00.607) 0:07:40.589 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:25:23 -0400 (0:00:00.330) 0:07:40.919 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:25:23 -0400 (0:00:00.865) 0:07:41.784 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:25:24 -0400 (0:00:00.217) 0:07:42.002 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:25:24 -0400 (0:00:00.170) 0:07:42.173 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:25:24 -0400 (0:00:00.268) 0:07:42.441 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:25:24 -0400 (0:00:00.246) 0:07:42.688 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:25:25 -0400 (0:00:00.211) 0:07:42.899 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:25:25 -0400 (0:00:00.287) 0:07:43.187 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:25:25 -0400 (0:00:00.235) 0:07:43.423 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:25:25 -0400 (0:00:00.262) 0:07:43.686 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:25:26 -0400 (0:00:00.462) 0:07:44.148 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:25:26 -0400 
(0:00:00.293) 0:07:44.441 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:25:28 -0400 (0:00:02.262) 0:07:46.704 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:25:29 -0400 (0:00:00.412) 0:07:47.116 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:25:29 -0400 (0:00:00.378) 0:07:47.494 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:25:30 -0400 (0:00:00.414) 0:07:47.909 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:25:30 -0400 (0:00:00.296) 0:07:48.205 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is 
none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:25:30 -0400 (0:00:00.359) 0:07:48.565 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:25:30 -0400 (0:00:00.298) 0:07:48.863 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:25:31 -0400 (0:00:00.340) 0:07:49.204 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:25:31 -0400 (0:00:00.270) 0:07:49.475 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:25:31 -0400 (0:00:00.191) 0:07:49.666 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:25:32 -0400 (0:00:00.308) 0:07:49.974 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:25:32 -0400 (0:00:00.317) 0:07:50.292 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device 
identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:25:32 -0400 (0:00:00.575) 0:07:50.868 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:25:33 -0400 (0:00:00.262) 0:07:51.130 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:25:33 -0400 (0:00:00.243) 0:07:51.374 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:25:33 -0400 (0:00:00.130) 0:07:51.505 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:25:33 -0400 (0:00:00.287) 0:07:51.792 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:25:34 -0400 (0:00:00.247) 0:07:52.040 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:25:34 -0400 (0:00:00.404) 0:07:52.444 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:25:34 -0400 (0:00:00.284) 0:07:52.729 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879886.0790098, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879886.0790098, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1615, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": 
"inode/blockdevice", "mode": "0660", "mtime": 1773879886.0790098, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:25:36 -0400 (0:00:01.313) 0:07:54.042 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:25:36 -0400 (0:00:00.396) 0:07:54.439 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:25:36 -0400 (0:00:00.195) 0:07:54.634 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:25:36 -0400 (0:00:00.231) 0:07:54.866 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:25:37 -0400 (0:00:00.271) 0:07:55.138 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:25:37 -0400 (0:00:00.175) 0:07:55.329 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:25:37 -0400 (0:00:00.286) 0:07:55.615 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879886.5590107, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773879886.5590107, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1691, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773879886.5590107, "nlink": 1, "path": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", 
"pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:25:38 -0400 (0:00:01.206) 0:07:56.821 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:25:40 -0400 (0:00:01.921) 0:07:58.743 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007680", "end": "2026-03-18 20:25:42.000930", "rc": 0, "start": "2026-03-18 20:25:41.993250" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 37471199-8d7b-4b98-b054-dea418ad7607 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 681848 Threads: 2 Salt: c9 89 28 8a 91 a5 8d 34 1b 1a 96 f7 7d ac 90 1f 7f 93 bd d8 05 26 df 87 20 a4 77 87 27 2b 19 39 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131730 Salt: d1 fe a2 62 20 ec 5a a6 7d 6e f7 0a c0 00 f3 d3 21 fc 41 ff 8f 0a 7a f0 cc 06 52 e8 9c 2b fa 02 Digest: d0 ef 7b 37 d5 7a ca d8 61 66 86 97 fa a9 dc 64 47 fa 4d 2b 14 86 3a d6 68 e9 23 9a 0b d9 07 2e TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:25:42 -0400 (0:00:01.401) 0:08:00.145 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:25:42 -0400 (0:00:00.374) 0:08:00.519 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:25:43 -0400 (0:00:00.477) 0:08:00.996 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:25:43 -0400 (0:00:00.290) 0:08:01.290 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:25:43 -0400 (0:00:00.299) 0:08:01.590 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:25:44 -0400 (0:00:00.471) 0:08:02.061 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:25:44 -0400 (0:00:00.325) 0:08:02.387 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:25:44 -0400 (0:00:00.364) 0:08:02.751 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:25:45 -0400 (0:00:00.548) 0:08:03.299 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:25:45 -0400 (0:00:00.450) 0:08:03.750 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:25:46 -0400 (0:00:00.419) 0:08:04.169 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:25:46 -0400 (0:00:00.450) 0:08:04.620 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:25:47 -0400 (0:00:00.583) 0:08:05.203 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, 
"_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:25:47 -0400 (0:00:00.284) 0:08:05.488 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:25:47 -0400 (0:00:00.266) 0:08:05.754 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:25:48 -0400 (0:00:00.247) 0:08:06.002 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:25:48 -0400 (0:00:00.244) 0:08:06.247 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:25:48 -0400 (0:00:00.242) 0:08:06.489 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:25:48 -0400 (0:00:00.203) 0:08:06.692 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:25:49 -0400 (0:00:00.279) 0:08:06.971 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:25:49 -0400 (0:00:00.212) 0:08:07.184 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] 
********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:25:49 -0400 (0:00:00.275) 0:08:07.460 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:25:49 -0400 (0:00:00.226) 0:08:07.686 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:25:50 -0400 (0:00:00.209) 0:08:07.895 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:25:50 -0400 (0:00:00.256) 0:08:08.151 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:25:50 -0400 (0:00:00.253) 0:08:08.405 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:25:50 -0400 (0:00:00.231) 0:08:08.636 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:25:51 -0400 (0:00:00.260) 0:08:08.896 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:25:51 -0400 (0:00:00.194) 0:08:09.091 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 
18 March 2026 20:25:51 -0400 (0:00:00.200) 0:08:09.291 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:25:51 -0400 (0:00:00.201) 0:08:09.493 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:25:51 -0400 (0:00:00.204) 0:08:09.697 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:25:52 -0400 (0:00:00.194) 0:08:09.891 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:25:52 -0400 (0:00:00.295) 0:08:10.187 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:25:52 -0400 (0:00:00.331) 0:08:10.519 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:25:52 -0400 (0:00:00.354) 0:08:10.874 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:25:53 -0400 (0:00:00.395) 0:08:11.270 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:25:53 -0400 (0:00:00.444) 0:08:11.714 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin 
pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:25:54 -0400 (0:00:00.293) 0:08:12.007 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:25:54 -0400 (0:00:00.284) 0:08:12.291 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:25:54 -0400 (0:00:00.262) 0:08:12.554 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:25:54 -0400 (0:00:00.288) 0:08:12.842 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:25:55 -0400 (0:00:00.380) 0:08:13.222 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:25:55 -0400 (0:00:00.371) 0:08:13.594 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:25:56 -0400 (0:00:00.342) 0:08:13.936 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:25:56 -0400 (0:00:00.355) 0:08:14.291 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:25:56 -0400 (0:00:00.385) 0:08:14.677 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:25:57 -0400 (0:00:00.267) 0:08:14.945 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:25:57 -0400 (0:00:00.408) 0:08:15.353 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:25:57 -0400 (0:00:00.305) 0:08:15.658 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:25:58 -0400 (0:00:00.224) 0:08:15.883 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:25:58 -0400 (0:00:00.227) 0:08:16.110 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:25:58 -0400 (0:00:00.193) 0:08:16.304 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:25:58 -0400 (0:00:00.259) 0:08:16.563 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:25:58 -0400 (0:00:00.274) 0:08:16.838 ******* 
skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:25:59 -0400 (0:00:00.177) 0:08:17.016 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:25:59 -0400 (0:00:00.261) 0:08:17.277 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:25:59 -0400 (0:00:00.294) 0:08:17.571 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:26:00 -0400 (0:00:00.332) 0:08:17.903 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:26:00 -0400 (0:00:00.236) 0:08:18.140 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:26:00 -0400 (0:00:00.335) 0:08:18.476 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 3] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:242 Wednesday 18 March 2026 20:26:01 -0400 (0:00:01.267) 0:08:19.743 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:26:02 -0400 (0:00:01.009) 0:08:20.753 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, 
"storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:26:03 -0400 (0:00:00.356) 0:08:21.109 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:26:03 -0400 (0:00:00.380) 0:08:21.490 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:26:03 -0400 (0:00:00.024) 0:08:21.514 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:26:03 -0400 (0:00:00.335) 0:08:21.850 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:26:04 -0400 (0:00:00.351) 0:08:22.201 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:26:04 -0400 (0:00:00.279) 0:08:22.481 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:26:06 -0400 (0:00:02.323) 0:08:24.804 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { 
"ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:26:07 -0400 (0:00:00.708) 0:08:25.513 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:26:07 -0400 (0:00:00.313) 0:08:25.827 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:26:08 -0400 (0:00:00.259) 0:08:26.086 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:26:08 -0400 (0:00:00.218) 0:08:26.305 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:26:08 -0400 (0:00:00.266) 0:08:26.571 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:26:09 -0400 (0:00:00.830) 0:08:27.402 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:26:11 -0400 (0:00:01.985) 0:08:29.388 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": 
"test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:26:11 -0400 (0:00:00.261) 0:08:29.649 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:26:11 -0400 (0:00:00.204) 0:08:29.854 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:26:13 -0400 (0:00:01.941) 0:08:31.796 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:26:14 -0400 (0:00:00.482) 0:08:32.278 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:26:14 -0400 (0:00:00.177) 0:08:32.456 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:26:14 -0400 (0:00:00.232) 0:08:32.689 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:26:15 -0400 (0:00:00.232) 0:08:32.922 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:26:18 -0400 (0:00:03.856) 0:08:36.778 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": 
"iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": 
"systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service": { "name": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { 
"name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:26:22 -0400 (0:00:03.148) 0:08:39.926 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d65b0a67e\x2dd6c3\x2d4a45\x2db2ff\x2d091eb7e935e1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "name": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-udevd-kernel.socket dev-sda.device systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d65b0a67e\\\\x2dd6c3\\\\x2d4a45\\\\x2db2ff\\\\x2d091eb7e935e1.target\"", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-65b0a67e-d6c3-4a45-b2ff-091eb7e935e1 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d65b0a67e\\\\x2dd6c3\\\\x2d4a45\\\\x2db2ff\\\\x2d091eb7e935e1.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", 
"NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:24:58 EDT", "StateChangeTimestampMonotonic": "1955410316", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d65b0a67e\\\\x2dd6c3\\\\x2d4a45\\\\x2db2ff\\\\x2d091eb7e935e1.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:26:24 -0400 (0:00:02.160) 0:08:42.086 ******* fatal: [managed-node4]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-37471199-8d7b-4b98-b054-dea418ad7607' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:26:26 -0400 (0:00:02.532) 0:08:44.619 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-37471199-8d7b-4b98-b054-dea418ad7607' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:26:27 -0400 (0:00:00.415) 0:08:45.034 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d65b0a67e\x2dd6c3\x2d4a45\x2db2ff\x2d091eb7e935e1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "name": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": 
"systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d65b0a67e\\x2dd6c3\\x2d4a45\\x2db2ff\\x2d091eb7e935e1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d65b0a67e\\\\x2dd6c3\\\\x2d4a45\\\\x2db2ff\\\\x2d091eb7e935e1.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", 
"StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:26:29 -0400 (0:00:01.952) 0:08:46.987 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:26:29 -0400 (0:00:00.245) 0:08:47.233 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:26:29 -0400 (0:00:00.435) 0:08:47.668 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:26:30 -0400 (0:00:00.337) 0:08:48.005 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879961.6661627, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773879961.6661627, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773879961.6661627, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2989918252", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:26:31 -0400 (0:00:01.348) 0:08:49.354 
******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:267 Wednesday 18 March 2026 20:26:31 -0400 (0:00:00.306) 0:08:49.660 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:26:32 -0400 (0:00:00.926) 0:08:50.586 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:26:32 -0400 (0:00:00.014) 0:08:50.601 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:26:33 -0400 (0:00:00.310) 0:08:50.912 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:26:33 -0400 (0:00:00.511) 0:08:51.423 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:26:33 -0400 (0:00:00.281) 0:08:51.704 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:26:35 -0400 (0:00:02.008) 0:08:53.712 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => 
(item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:26:36 -0400 (0:00:00.623) 0:08:54.336 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:26:36 -0400 (0:00:00.356) 0:08:54.692 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:26:37 -0400 (0:00:00.241) 0:08:54.934 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:26:37 -0400 (0:00:00.201) 0:08:55.136 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:26:37 -0400 (0:00:00.285) 0:08:55.421 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:26:38 -0400 (0:00:00.836) 0:08:56.258 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:26:40 -0400 (0:00:02.042) 0:08:58.300 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", 
"mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:26:40 -0400 (0:00:00.382) 0:08:58.683 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:26:41 -0400 (0:00:00.358) 0:08:59.042 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:26:43 -0400 (0:00:02.276) 0:09:01.318 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:26:43 -0400 (0:00:00.483) 0:09:01.801 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:26:44 -0400 (0:00:00.198) 0:09:02.000 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:26:44 -0400 (0:00:00.282) 0:09:02.282 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:26:44 -0400 (0:00:00.220) 0:09:02.502 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:26:46 -0400 (0:00:01.952) 0:09:04.455 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": 
{ "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service": { "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", 
"source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { 
"name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:26:49 -0400 (0:00:02.887) 0:09:07.343 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d37471199\x2d8d7b\x2d4b98\x2db054\x2ddea418ad7607.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-udevd-kernel.socket dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-37471199-8d7b-4b98-b054-dea418ad7607", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ 
path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-37471199-8d7b-4b98-b054-dea418ad7607 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-37471199-8d7b-4b98-b054-dea418ad7607 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", 
"NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:26:28 EDT", "StateChangeTimestampMonotonic": "2046190522", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:26:51 -0400 (0:00:01.832) 0:09:09.175 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": 
"/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:26:54 -0400 (0:00:02.926) 0:09:12.102 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:26:54 -0400 (0:00:00.239) 0:09:12.342 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879895.99503, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f6d985f4439632af52a88b7a9a95b39d4190d8f3", "ctime": 1773879895.99103, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": 
"text/plain", "mode": "0644", "mtime": 1773879895.99103, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:26:55 -0400 (0:00:01.275) 0:09:13.617 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:26:56 -0400 (0:00:01.247) 0:09:14.865 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d37471199\x2d8d7b\x2d4b98\x2db054\x2ddea418ad7607.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", 
"RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:26:28 EDT", "StateChangeTimestampMonotonic": "2046190522", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:26:59 -0400 (0:00:02.067) 0:09:16.932 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", 
"_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:26:59 -0400 (0:00:00.403) 0:09:17.335 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:26:59 -0400 (0:00:00.325) 0:09:17.660 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, 
"changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:27:00 -0400 (0:00:00.275) 0:09:17.936 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-37471199-8d7b-4b98-b054-dea418ad7607" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:27:01 -0400 (0:00:01.685) 0:09:19.621 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:27:03 -0400 (0:00:01.643) 0:09:21.265 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:27:04 -0400 (0:00:01.450) 0:09:22.715 ******* skipping: [managed-node4] => (item={'src': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to 
refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:27:05 -0400 (0:00:00.316) 0:09:23.032 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:27:07 -0400 (0:00:01.878) 0:09:24.910 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773879909.3480568, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3c9083ad543d3c8c04b19a6e4ad3e2cafe24ac3a", "ctime": 1773879901.419041, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 79691982, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773879901.4200673, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "4209227579", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:27:08 -0400 (0:00:01.200) 0:09:26.111 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda1', 'name': 'luks-37471199-8d7b-4b98-b054-dea418ad7607', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-37471199-8d7b-4b98-b054-dea418ad7607", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:27:09 -0400 (0:00:01.571) 0:09:27.682 ******* ok: [managed-node4] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:284 Wednesday 18 March 2026 20:27:11 -0400 (0:00:02.126) 0:09:29.808 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:27:13 -0400 (0:00:01.241) 0:09:31.050 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:27:13 -0400 (0:00:00.314) 0:09:31.364 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:27:13 -0400 (0:00:00.328) 0:09:31.693 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "d4a68fbc-1ba8-41a5-96a0-c9be988b04d7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": 
"63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:27:15 -0400 (0:00:01.292) 0:09:32.985 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002956", "end": "2026-03-18 20:27:16.048684", "rc": 0, "start": "2026-03-18 20:27:16.045728" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:27:16 -0400 (0:00:01.100) 0:09:34.086 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002957", "end": "2026-03-18 20:27:17.309546", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:27:17.306589" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:27:17 -0400 (0:00:01.236) 0:09:35.322 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:27:17 -0400 (0:00:00.397) 0:09:35.720 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:27:18 -0400 (0:00:00.253) 0:09:35.974 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:27:18 -0400 (0:00:00.200) 0:09:36.174 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:27:18 -0400 (0:00:00.241) 0:09:36.416 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:27:19 -0400 (0:00:00.666) 0:09:37.082 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:27:19 -0400 
(0:00:00.176) 0:09:37.259 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:27:19 -0400 (0:00:00.171) 0:09:37.431 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:27:19 -0400 (0:00:00.221) 0:09:37.653 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:27:20 -0400 (0:00:00.229) 0:09:37.882 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:27:20 -0400 (0:00:00.170) 0:09:38.052 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:27:20 -0400 (0:00:00.223) 0:09:38.276 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:27:20 -0400 (0:00:00.248) 0:09:38.524 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:27:20 -0400 (0:00:00.192) 0:09:38.716 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:27:21 -0400 (0:00:00.214) 0:09:38.931 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:85202): WARNING **: 20:27:22.185: failed to load module nvme: 
libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:27:22 -0400 (0:00:01.423) 0:09:40.354 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:27:22 -0400 (0:00:00.245) 0:09:40.600 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:27:23 -0400 (0:00:00.656) 0:09:41.256 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:27:23 -0400 (0:00:00.174) 0:09:41.450 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:27:23 -0400 (0:00:00.224) 0:09:41.675 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 
March 2026 20:27:23 -0400 (0:00:00.181) 0:09:41.856 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:27:24 -0400 (0:00:00.226) 0:09:42.082 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:27:24 -0400 (0:00:00.226) 0:09:42.309 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:27:24 -0400 (0:00:00.270) 0:09:42.587 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:27:24 -0400 (0:00:00.230) 0:09:42.817 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:27:25 -0400 (0:00:00.283) 0:09:43.101 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:27:25 -0400 (0:00:00.180) 0:09:43.281 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:27:25 -0400 (0:00:00.357) 0:09:43.639 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:27:26 -0400 (0:00:00.282) 0:09:43.922 ******* included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:27:26 -0400 (0:00:00.645) 0:09:44.568 ******* skipping: [managed-node4] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:27:26 -0400 (0:00:00.311) 0:09:44.879 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:27:27 -0400 (0:00:00.569) 0:09:45.449 ******* skipping: [managed-node4] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:27:27 -0400 (0:00:00.376) 0:09:45.825 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:27:28 -0400 (0:00:00.688) 0:09:46.514 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:27:28 -0400 (0:00:00.296) 0:09:46.811 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:27:29 -0400 (0:00:00.193) 0:09:47.004 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:27:29 -0400 (0:00:00.228) 0:09:47.232 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:27:29 -0400 (0:00:00.270) 0:09:47.503 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:27:30 -0400 (0:00:00.597) 0:09:48.101 ******* skipping: [managed-node4] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", 
"cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:27:30 -0400 (0:00:00.319) 0:09:48.420 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:27:31 -0400 (0:00:00.851) 0:09:49.272 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:27:31 -0400 (0:00:00.276) 0:09:49.548 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:27:32 -0400 (0:00:01.267) 0:09:50.816 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:27:33 -0400 (0:00:00.231) 0:09:51.047 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:27:33 -0400 (0:00:00.275) 0:09:51.323 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that 
Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:27:33 -0400 (0:00:00.306) 0:09:51.629 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:27:34 -0400 (0:00:00.266) 0:09:51.896 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:27:34 -0400 (0:00:00.292) 0:09:52.188 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:27:34 -0400 (0:00:00.304) 0:09:52.493 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:27:35 -0400 (0:00:00.553) 0:09:53.046 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 
20:27:35 -0400 (0:00:00.413) 0:09:53.459 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:27:37 -0400 (0:00:02.295) 0:09:55.755 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:27:38 -0400 (0:00:00.486) 0:09:56.241 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:27:38 -0400 (0:00:00.410) 0:09:56.651 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:27:39 -0400 (0:00:00.553) 0:09:57.204 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:27:39 -0400 (0:00:00.424) 0:09:57.629 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": 
"Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:27:40 -0400 (0:00:00.377) 0:09:58.006 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:27:40 -0400 (0:00:00.373) 0:09:58.380 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:27:40 -0400 (0:00:00.415) 0:09:58.795 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:27:41 -0400 (0:00:00.319) 0:09:59.115 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:27:41 -0400 (0:00:00.279) 0:09:59.394 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:27:41 -0400 (0:00:00.232) 0:09:59.626 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:27:42 -0400 (0:00:00.262) 0:09:59.889 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] 
***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:27:42 -0400 (0:00:00.562) 0:10:00.452 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:27:42 -0400 (0:00:00.393) 0:10:00.846 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:27:43 -0400 (0:00:00.379) 0:10:01.225 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:27:43 -0400 (0:00:00.211) 0:10:01.436 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:27:44 -0400 (0:00:00.450) 0:10:01.887 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:27:44 -0400 (0:00:00.328) 0:10:02.215 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:27:44 -0400 (0:00:00.461) 0:10:02.677 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:27:45 -0400 (0:00:00.499) 0:10:03.177 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880013.9662669, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880013.9662669, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1884, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 
1773880013.9662669, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:27:46 -0400 (0:00:01.397) 0:10:04.574 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:27:47 -0400 (0:00:00.425) 0:10:05.000 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:27:47 -0400 (0:00:00.254) 0:10:05.254 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:27:47 -0400 (0:00:00.389) 0:10:05.644 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:27:48 -0400 (0:00:00.333) 0:10:05.978 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:27:48 -0400 (0:00:00.274) 0:10:06.288 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:27:48 -0400 (0:00:00.348) 0:10:06.636 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:27:49 -0400 (0:00:00.269) 0:10:06.905 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: 
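
Note: the stat output above feeds the device-node assertions that follow it. A minimal sketch of that check; __storage_test_dev is a hypothetical register name, and the device path comes from the log.

- name: Sketch of the device node presence check
  hosts: managed-node4
  vars:
    storage_test_device_path: /dev/sda1   # from the log above
  tasks:
    - name: See whether the device node is present
      ansible.builtin.stat:
        path: "{{ storage_test_device_path }}"
      register: __storage_test_dev   # hypothetical register name

    - name: Verify the presence of the device node
      ansible.builtin.assert:
        that:
          - __storage_test_dev.stat.exists
          - __storage_test_dev.stat.isblk
        fail_msg: "{{ storage_test_device_path }} is missing or is not a block device"
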
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:27:51 -0400 (0:00:01.983) 0:10:08.888 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:27:51 -0400 (0:00:00.184) 0:10:09.074 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:27:51 -0400 (0:00:00.230) 0:10:09.304 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:27:51 -0400 (0:00:00.444) 0:10:09.748 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:27:52 -0400 (0:00:00.247) 0:10:09.996 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:27:52 -0400 (0:00:00.220) 0:10:10.216 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:27:52 -0400 (0:00:00.272) 0:10:10.489 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:27:52 -0400 (0:00:00.223) 0:10:10.712 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: 
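
Note: the LUKS version, key size and cipher tasks above are skipped because storage_test_volume.encryption is false in this pass. For an encrypted volume they would need to inspect the LUKS header; below is a hedged sketch of one such check using cryptsetup luksDump (assumed approach, and the backing device path is only an example).

- name: Sketch of a LUKS2 header check
  hosts: managed-node4
  vars:
    __luks_backing_device: /dev/sda1   # example only
  tasks:
    - name: Collect LUKS info for this volume
      ansible.builtin.command: cryptsetup luksDump {{ __luks_backing_device }}
      register: __luks_dump
      changed_when: false

    - name: Check LUKS version
      ansible.builtin.assert:
        that:
          - __luks_dump.stdout is search('Version:\s+2')
        fail_msg: "{{ __luks_backing_device }} does not carry a LUKS2 header"
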
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:27:53 -0400 (0:00:00.228) 0:10:10.941 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:27:53 -0400 (0:00:00.400) 0:10:11.342 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:27:53 -0400 (0:00:00.389) 0:10:11.731 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:27:54 -0400 (0:00:00.254) 0:10:11.985 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:27:54 -0400 (0:00:00.406) 0:10:12.392 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:27:54 -0400 (0:00:00.247) 0:10:12.639 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:27:55 -0400 (0:00:00.335) 0:10:12.975 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:27:55 -0400 (0:00:00.226) 0:10:13.201 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: 
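
Note: _storage_test_expected_crypttab_entries is "0" here, so the crypttab format, backing-device and key-file checks are skipped. A simplified sketch of the entry-count assertion, assuming /etc/crypttab exists and counting its non-comment lines (the real check matches the specific device):

- name: Sketch of the /etc/crypttab entry count check
  hosts: managed-node4
  vars:
    _storage_test_expected_crypttab_entries: 0   # would be 1 once the volume is encrypted
  tasks:
    - name: Read /etc/crypttab
      ansible.builtin.slurp:
        src: /etc/crypttab
      register: __crypttab

    - name: Check for /etc/crypttab entry (simplified to a non-comment line count)
      ansible.builtin.assert:
        that:
          - >-
            (__crypttab.content | b64decode).splitlines()
            | reject('match', '\s*(#|$)')
            | list | length == _storage_test_expected_crypttab_entries | int
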
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:27:55 -0400 (0:00:00.206) 0:10:13.407 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:27:55 -0400 (0:00:00.217) 0:10:13.625 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:27:55 -0400 (0:00:00.179) 0:10:13.804 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:27:56 -0400 (0:00:00.241) 0:10:14.046 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:27:56 -0400 (0:00:00.233) 0:10:14.280 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:27:56 -0400 (0:00:00.224) 0:10:14.505 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:27:56 -0400 (0:00:00.204) 0:10:14.709 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:27:57 -0400 (0:00:00.173) 0:10:14.883 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 
18 March 2026 20:27:57 -0400 (0:00:00.303) 0:10:15.187 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:27:57 -0400 (0:00:00.273) 0:10:15.461 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:27:57 -0400 (0:00:00.280) 0:10:15.741 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:27:58 -0400 (0:00:00.309) 0:10:16.051 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:27:58 -0400 (0:00:00.227) 0:10:16.279 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:27:58 -0400 (0:00:00.284) 0:10:16.563 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:27:58 -0400 (0:00:00.279) 0:10:16.843 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:27:59 -0400 (0:00:00.298) 0:10:17.142 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:27:59 -0400 (0:00:00.293) 0:10:17.435 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task 
path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:27:59 -0400 (0:00:00.255) 0:10:17.691 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:28:00 -0400 (0:00:00.313) 0:10:18.005 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:28:00 -0400 (0:00:00.326) 0:10:18.332 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:28:00 -0400 (0:00:00.508) 0:10:18.841 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:28:01 -0400 (0:00:00.340) 0:10:19.182 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:28:01 -0400 (0:00:00.398) 0:10:19.580 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:28:02 -0400 (0:00:00.319) 0:10:19.899 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:28:02 -0400 (0:00:00.406) 0:10:20.306 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:28:02 -0400 (0:00:00.367) 0:10:20.674 ******* 
skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:28:03 -0400 (0:00:00.376) 0:10:21.051 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:28:03 -0400 (0:00:00.337) 0:10:21.388 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:28:03 -0400 (0:00:00.350) 0:10:21.739 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:28:04 -0400 (0:00:00.361) 0:10:22.100 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:28:04 -0400 (0:00:00.305) 0:10:22.406 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:28:04 -0400 (0:00:00.316) 0:10:22.723 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:28:05 -0400 (0:00:00.311) 0:10:23.034 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:28:05 -0400 (0:00:00.203) 0:10:23.238 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] 
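
Note: for this partition-type volume storage_test_expected_size stays undefined and the size assertion is skipped; for an LVM volume sized by percentage the role derives an expected byte count and compares it with the actual size. A hedged sketch of that arithmetic with entirely hypothetical values:

- name: Sketch of the expected-vs-actual size comparison
  hosts: managed-node4
  vars:
    __pool_size: 10g                  # hypothetical pool size
    __size_percent: 40                # hypothetical "40%" volume size request
    __actual_size_bytes: 4294967296   # hypothetical size reported for the volume
  tasks:
    - name: Calculate the expected size based on pool size and percentage value
      ansible.builtin.set_fact:
        storage_test_expected_size: >-
          {{ ((__pool_size | human_to_bytes) * __size_percent / 100) | int }}

    - name: Assert expected size is actual size (within 1 percent)
      ansible.builtin.assert:
        that:
          - >-
            (__actual_size_bytes | int - storage_test_expected_size | int) | abs
            <= (storage_test_expected_size | int) * 0.01
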
************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:28:05 -0400 (0:00:00.311) 0:10:23.549 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:28:05 -0400 (0:00:00.258) 0:10:23.808 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:28:06 -0400 (0:00:00.218) 0:10:24.027 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:28:06 -0400 (0:00:00.144) 0:10:24.171 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:28:06 -0400 (0:00:00.195) 0:10:24.367 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:28:06 -0400 (0:00:00.243) 0:10:24.610 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:28:06 -0400 (0:00:00.209) 0:10:24.819 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:28:07 -0400 (0:00:00.199) 0:10:25.019 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' 
and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:28:07 -0400 (0:00:00.263) 0:10:25.283 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:28:07 -0400 (0:00:00.276) 0:10:25.559 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:28:07 -0400 (0:00:00.296) 0:10:25.855 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:28:08 -0400 (0:00:00.268) 0:10:26.124 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:290 Wednesday 18 March 2026 20:28:09 -0400 (0:00:01.313) 0:10:27.437 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:28:10 -0400 (0:00:00.841) 0:10:28.278 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:28:10 -0400 (0:00:00.324) 0:10:28.602 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:28:11 -0400 (0:00:00.394) 0:10:28.997 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:28:11 -0400 
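
Note: the changed task above leaves a zero-length marker file at /opt/test1/quux before the next safe_mode check runs. A sketch of a task that produces the result shown (create-test-file.yml itself is not printed in the log):

- name: Sketch of the test-file creation
  hosts: managed-node4
  tasks:
    - name: Create a file
      ansible.builtin.file:
        path: /opt/test1/quux
        state: touch
        mode: "0644"
        owner: root
        group: root
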
(0:00:00.032) 0:10:29.029 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:28:11 -0400 (0:00:00.375) 0:10:29.405 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:28:12 -0400 (0:00:00.587) 0:10:29.992 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:28:12 -0400 (0:00:00.372) 0:10:30.365 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:28:14 -0400 (0:00:02.101) 0:10:32.466 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:28:15 -0400 (0:00:00.951) 0:10:33.418 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] 
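
Note: the skipped RedHat.yml and CentOS.yml items and the included CentOS_9.yml above come from the usual most-specific-vars-file loop. A sketch of that pattern follows; the __vars_file construction and the vars directory location are assumptions (inside the role it resolves against the role's own vars/ directory).

- name: Sketch of the platform/version specific vars lookup
  hosts: managed-node4
  gather_facts: true
  tasks:
    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      vars:
        __vars_file: "{{ playbook_dir }}/vars/{{ item }}"   # assumed location
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
      when: __vars_file is file
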
*** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:28:15 -0400 (0:00:00.323) 0:10:33.742 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:28:16 -0400 (0:00:00.315) 0:10:34.058 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:28:16 -0400 (0:00:00.196) 0:10:34.254 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:28:16 -0400 (0:00:00.203) 0:10:34.458 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:28:17 -0400 (0:00:00.861) 0:10:35.319 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:28:19 -0400 (0:00:02.112) 0:10:37.431 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:28:19 -0400 (0:00:00.381) 0:10:37.813 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:28:20 -0400 (0:00:00.355) 0:10:38.168 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 
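
Note: the storage_pools value echoed above corresponds to a role invocation roughly like the sketch below; the test wraps it in its own helper include, and since storage_safe_mode_global is true the surrounding verify-role-failed.yml expects this run to fail rather than re-create the existing volume with encryption.

- name: Sketch of the role invocation behind the storage_pools shown above
  hosts: managed-node4
  tasks:
    - name: Run the storage role
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks2
                encryption_password: yabbadabbadoo
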
2026 20:28:22 -0400 (0:00:02.432) 0:10:40.601 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:28:23 -0400 (0:00:00.549) 0:10:41.150 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:28:23 -0400 (0:00:00.209) 0:10:41.360 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:28:23 -0400 (0:00:00.338) 0:10:41.699 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:28:24 -0400 (0:00:00.247) 0:10:41.947 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:28:26 -0400 (0:00:02.073) 0:10:44.020 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" 
}, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service": { "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": 
"systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:28:29 -0400 (0:00:03.076) 0:10:47.098 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d37471199\x2d8d7b\x2d4b98\x2db054\x2ddea418ad7607.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": 
"systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda1.device cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.target\"", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-37471199-8d7b-4b98-b054-dea418ad7607", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-37471199-8d7b-4b98-b054-dea418ad7607 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-37471199-8d7b-4b98-b054-dea418ad7607 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-37471199-8d7b-4b98-b054-dea418ad7607 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": 
"18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", 
"RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:26:28 EDT", "StateChangeTimestampMonotonic": "2046190522", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:28:31 -0400 (0:00:02.213) 0:10:49.312 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:28:33 -0400 (0:00:02.484) 0:10:51.796 ******* fatal: [managed-node4]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:28:34 -0400 (0:00:00.369) 0:10:52.165 ******* changed: [managed-node4] => 
(item=systemd-cryptsetup@luks\x2d37471199\x2d8d7b\x2d4b98\x2db054\x2ddea418ad7607.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "name": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d37471199\\x2d8d7b\\x2d4b98\\x2db054\\x2ddea418ad7607.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d37471199\\\\x2d8d7b\\\\x2d4b98\\\\x2db054\\\\x2ddea418ad7607.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": 
"1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:28:36 -0400 (0:00:02.185) 0:10:54.351 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:28:36 -0400 (0:00:00.322) 0:10:54.674 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:28:37 -0400 (0:00:00.526) 0:10:55.200 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:28:37 -0400 (0:00:00.370) 0:10:55.571 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880089.4174168, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773880089.4174168, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773880089.4174168, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3484468190", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:28:38 -0400 (0:00:01.275) 0:10:56.846 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:317 Wednesday 18 March 2026 20:28:39 -0400 (0:00:00.421) 0:10:57.267 ******* ok: [managed-node4] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_testggyp06_hlukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write 
the key into the key file] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:324 Wednesday 18 March 2026 20:28:43 -0400 (0:00:04.429) 0:11:01.697 ******* ok: [managed-node4] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testggyp06_hlukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1773880124.225166-137004-137575307880017/.source", "state": "file", "uid": 0 } TASK [Add encryption to the volume - 2] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:331 Wednesday 18 March 2026 20:28:48 -0400 (0:00:04.991) 0:11:06.689 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:28:49 -0400 (0:00:00.232) 0:11:06.922 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:28:49 -0400 (0:00:00.012) 0:11:06.938 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:28:49 -0400 (0:00:00.223) 0:11:07.162 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:28:49 -0400 (0:00:00.355) 0:11:07.517 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:28:50 -0400 (0:00:00.425) 0:11:07.942 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:28:54 -0400 (0:00:03.974) 0:11:11.916 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" 
} ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:28:54 -0400 (0:00:00.814) 0:11:12.731 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:28:55 -0400 (0:00:00.273) 0:11:13.010 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:28:55 -0400 (0:00:00.270) 0:11:13.281 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:28:55 -0400 (0:00:00.220) 0:11:13.502 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:28:55 -0400 (0:00:00.271) 0:11:13.773 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:28:56 -0400 (0:00:00.657) 0:11:14.430 
******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:28:58 -0400 (0:00:01.933) 0:11:16.364 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_luks_version": "luks2", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:28:58 -0400 (0:00:00.392) 0:11:16.756 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:28:59 -0400 (0:00:00.232) 0:11:16.989 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:29:01 -0400 (0:00:02.203) 0:11:19.193 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:29:01 -0400 (0:00:00.418) 0:11:19.611 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:29:01 -0400 (0:00:00.156) 0:11:19.767 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:29:02 -0400 (0:00:00.327) 0:11:20.095 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:29:02 -0400 (0:00:00.214) 0:11:20.310 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:29:04 -0400 (0:00:02.008) 0:11:22.319 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, 
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:29:07 -0400 (0:00:02.791) 0:11:25.111 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:29:07 -0400 (0:00:00.458) 0:11:25.569 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "/tmp/storage_testggyp06_hlukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", 
"encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:29:18 -0400 (0:00:11.296) 0:11:36.865 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:29:19 -0400 (0:00:00.214) 0:11:37.080 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880024.636288, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ef3d9c6fd2bd0150e34dc9b1688802b6e68e17ac", "ctime": 1773880024.633288, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880024.633288, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:29:20 -0400 (0:00:01.317) 0:11:38.398 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:29:21 -0400 (0:00:01.459) 0:11:39.858 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:29:22 -0400 (0:00:00.490) 0:11:40.348 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": null }, { 
"action": "create format", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "/tmp/storage_testggyp06_hlukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:29:22 -0400 (0:00:00.349) 0:11:40.697 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": 
"/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:29:23 -0400 (0:00:00.445) 0:11:41.143 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:29:23 -0400 (0:00:00.358) 0:11:41.502 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': 'UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d4a68fbc-1ba8-41a5-96a0-c9be988b04d7" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:29:25 -0400 (0:00:01.648) 0:11:43.150 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:29:26 -0400 (0:00:01.569) 0:11:44.720 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': 
None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:29:28 -0400 (0:00:01.526) 0:11:46.246 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:29:28 -0400 (0:00:00.454) 0:11:46.700 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:29:30 -0400 (0:00:01.670) 0:11:48.370 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880037.3083131, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773880029.5322976, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 373293267, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773880029.5332766, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2801217999", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:29:31 -0400 (0:00:01.070) 0:11:49.442 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda1', 'name': 'luks-0c01a968-7a32-44e8-ab80-9164045a86b1', 'password': '/tmp/storage_testggyp06_hlukskey', 'state': 'present'}) => { "ansible_loop_var": 
"entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "/tmp/storage_testggyp06_hlukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:29:32 -0400 (0:00:01.284) 0:11:50.726 ******* ok: [managed-node4] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:348 Wednesday 18 March 2026 20:29:34 -0400 (0:00:01.848) 0:11:52.574 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:29:35 -0400 (0:00:00.369) 0:11:52.943 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:29:35 -0400 (0:00:00.378) 0:11:53.322 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:29:35 -0400 (0:00:00.248) 0:11:53.571 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "size": "4G", "type": "crypt", "uuid": "33acce6f-e36c-4030-85da-ff247dcf7e89" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "0c01a968-7a32-44e8-ab80-9164045a86b1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:29:36 -0400 (0:00:01.243) 0:11:54.814 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002987", "end": "2026-03-18 20:29:38.021706", "rc": 0, "start": "2026-03-18 20:29:38.018719" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:29:38 -0400 (0:00:01.276) 0:11:56.091 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003069", "end": "2026-03-18 20:29:39.083765", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:29:39.080696" } STDOUT: luks-0c01a968-7a32-44e8-ab80-9164045a86b1 /dev/sda1 /tmp/storage_testggyp06_hlukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:29:39 -0400 (0:00:01.049) 0:11:57.140 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testggyp06_hlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, '_device': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:29:39 -0400 (0:00:00.382) 0:11:57.523 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:29:39 -0400 (0:00:00.195) 0:11:57.719 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:29:39 -0400 (0:00:00.157) 0:11:57.906 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:29:40 -0400 (0:00:00.177) 0:11:58.084 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:29:40 -0400 (0:00:00.566) 0:11:58.651 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:29:40 -0400 (0:00:00.209) 0:11:58.860 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:29:41 -0400 (0:00:00.170) 0:11:59.031 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:29:41 -0400 (0:00:00.221) 0:11:59.253 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:29:41 -0400 (0:00:00.205) 0:11:59.458 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:29:41 -0400 (0:00:00.233) 0:11:59.703 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:29:41 -0400 (0:00:00.168) 0:11:59.872 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:29:42 -0400 (0:00:00.209) 0:12:00.084 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:29:42 -0400 (0:00:00.203) 0:12:00.288 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:29:42 -0400 (0:00:00.161) 0:12:00.449 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:91268): WARNING **: 20:29:43.545: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:29:43 -0400 (0:00:01.252) 0:12:01.723 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:29:43 -0400 (0:00:00.143) 0:12:01.866 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:29:44 -0400 (0:00:00.492) 0:12:02.358 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:29:44 -0400 (0:00:00.156) 0:12:02.529 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:29:44 -0400 (0:00:00.223) 0:12:02.756 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.149) 0:12:02.905 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.220) 0:12:03.125 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse 
the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.149) 0:12:03.275 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.138) 0:12:03.414 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.182) 0:12:03.596 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:29:45 -0400 (0:00:00.131) 0:12:03.728 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:29:46 -0400 (0:00:00.175) 0:12:03.903 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:29:46 -0400 (0:00:00.224) 0:12:04.128 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:29:46 -0400 (0:00:00.210) 0:12:04.339 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:29:46 -0400 (0:00:00.433) 0:12:04.772 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testggyp06_hlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:29:47 -0400 (0:00:00.313) 0:12:05.086 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:29:47 -0400 (0:00:00.579) 0:12:05.665 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testggyp06_hlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 
'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:29:48 -0400 (0:00:00.265) 0:12:05.930 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:29:48 -0400 (0:00:00.583) 0:12:06.514 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:29:48 -0400 (0:00:00.260) 0:12:06.775 ******* skipping: [managed-node4] => { "changed": false, 
"skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:29:49 -0400 (0:00:00.238) 0:12:07.013 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:29:49 -0400 (0:00:00.212) 0:12:07.226 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:29:49 -0400 (0:00:00.263) 0:12:07.489 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:29:50 -0400 (0:00:00.716) 0:12:08.205 ******* skipping: [managed-node4] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testggyp06_hlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_testggyp06_hlukskey", "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:29:50 -0400 (0:00:00.359) 0:12:08.565 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:29:51 -0400 (0:00:00.717) 0:12:09.282 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:29:51 -0400 (0:00:00.207) 0:12:09.490 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:29:51 -0400 (0:00:00.174) 0:12:09.664 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:29:51 -0400 (0:00:00.185) 0:12:09.849 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:29:52 -0400 (0:00:00.166) 0:12:10.016 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:29:52 -0400 (0:00:00.304) 0:12:10.320 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:29:52 -0400 (0:00:00.178) 0:12:10.498 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:29:52 -0400 (0:00:00.264) 0:12:10.762 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:29:53 -0400 (0:00:00.263) 0:12:11.026 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_testggyp06_hlukskey', 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:29:53 -0400 (0:00:00.399) 0:12:11.425 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:29:53 -0400 (0:00:00.321) 0:12:11.747 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:29:55 -0400 (0:00:01.678) 0:12:13.425 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:29:55 -0400 (0:00:00.359) 0:12:13.784 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:29:56 -0400 (0:00:00.263) 0:12:14.048 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:29:56 -0400 (0:00:00.404) 0:12:14.452 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:29:56 -0400 (0:00:00.348) 0:12:14.801 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:29:57 -0400 (0:00:00.288) 0:12:15.090 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:29:57 -0400 (0:00:00.229) 0:12:15.319 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:29:57 -0400 (0:00:00.261) 0:12:15.581 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:29:57 -0400 (0:00:00.206) 0:12:15.787 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:29:58 -0400 (0:00:00.221) 0:12:16.008 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:29:58 -0400 (0:00:00.177) 0:12:16.186 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:29:58 -0400 (0:00:00.300) 0:12:16.487 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: 
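(Illustrative sketch, not part of the recorded run: the storage_test_fstab_* facts set above drive the next assertions, which expect the /dev/mapper/luks-* identifier and the /opt/test1 mount point to each appear exactly once in /etc/fstab. The tasks below show roughly what such a check looks like as standalone Ansible tasks; the task names and the fstab_raw/expected_device variables are assumed for illustration and are not the role's own test code.)

- name: Read /etc/fstab from the managed node
  ansible.builtin.slurp:
    src: /etc/fstab
  register: fstab_raw

- name: Assert the LUKS mapping and mount point each appear exactly once
  vars:
    expected_device: /dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1
    fstab_lines: "{{ (fstab_raw.content | b64decode).splitlines() }}"
  ansible.builtin.assert:
    that:
      - fstab_lines | select('search', expected_device) | list | length == 1
      - fstab_lines | select('search', ' /opt/test1 ') | list | length == 1

(Using slurp keeps the check read-only on the managed node; the assertions fail the play if the mapping is missing or duplicated.)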
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:29:59 -0400 (0:00:00.727) 0:12:17.214 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:29:59 -0400 (0:00:00.367) 0:12:17.582 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:30:00 -0400 (0:00:00.311) 0:12:17.893 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:30:00 -0400 (0:00:00.335) 0:12:18.228 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:30:00 -0400 (0:00:00.334) 0:12:18.562 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:30:00 -0400 (0:00:00.227) 0:12:18.790 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:30:01 -0400 (0:00:00.340) 0:12:19.130 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:30:01 -0400 (0:00:00.526) 0:12:19.657 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880158.2065535, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880158.2065535, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2020, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773880158.2065535, "nlink": 1, 
"path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:30:03 -0400 (0:00:02.019) 0:12:21.677 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:30:04 -0400 (0:00:00.330) 0:12:22.007 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:30:04 -0400 (0:00:00.206) 0:12:22.214 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:30:04 -0400 (0:00:00.260) 0:12:22.475 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:30:04 -0400 (0:00:00.187) 0:12:22.662 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:30:04 -0400 (0:00:00.165) 0:12:22.828 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:30:05 -0400 (0:00:00.301) 0:12:23.129 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880158.6815543, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880158.6815543, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2058, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880158.6815543, "nlink": 1, "path": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, 
"size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:30:06 -0400 (0:00:01.223) 0:12:24.352 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:30:08 -0400 (0:00:01.930) 0:12:26.283 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007033", "end": "2026-03-18 20:30:09.430870", "rc": 0, "start": "2026-03-18 20:30:09.423837" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 0c01a968-7a32-44e8-ab80-9164045a86b1 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 686240 Threads: 2 Salt: c3 67 40 c8 88 a2 cd fb ec 2d 36 ef 52 2a 10 8b de 71 81 ea f0 e5 5f b1 7b 55 2a fb 1c 91 d1 fb AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 130549 Salt: fa 47 4b fb 5a 77 2c 5b 0d f4 65 f0 34 d2 7c 92 75 6b d0 82 d3 d0 03 40 10 3d 83 74 8c fd 0d ed Digest: 8b a0 2a ce 48 59 4c 9a 69 e3 3a df ca ea f9 8a 32 92 cf 0f 58 cb f0 99 50 82 25 bb 2a 3f e9 92 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:30:09 -0400 (0:00:01.220) 0:12:27.503 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:30:09 -0400 (0:00:00.354) 0:12:27.858 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:30:10 -0400 (0:00:00.423) 0:12:28.281 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:30:10 -0400 (0:00:00.424) 0:12:28.706 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:30:11 -0400 (0:00:00.348) 0:12:29.054 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:30:11 -0400 (0:00:00.538) 0:12:29.593 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:30:12 -0400 (0:00:00.312) 0:12:29.906 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:30:12 -0400 (0:00:00.316) 0:12:30.223 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0c01a968-7a32-44e8-ab80-9164045a86b1 /dev/sda1 /tmp/storage_testggyp06_hlukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_testggyp06_hlukskey" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:30:12 -0400 (0:00:00.519) 0:12:30.743 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:30:13 -0400 (0:00:00.221) 0:12:30.964 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:30:13 -0400 (0:00:00.434) 0:12:31.399 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:30:13 -0400 (0:00:00.369) 0:12:31.769 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:30:14 -0400 (0:00:00.434) 0:12:32.204 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": 
null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:30:14 -0400 (0:00:00.241) 0:12:32.445 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:30:14 -0400 (0:00:00.182) 0:12:32.628 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:30:15 -0400 (0:00:00.283) 0:12:32.932 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:30:15 -0400 (0:00:00.298) 0:12:33.230 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:30:15 -0400 (0:00:00.263) 0:12:33.494 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:30:15 -0400 (0:00:00.225) 0:12:33.719 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:30:16 -0400 (0:00:00.265) 0:12:33.984 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:30:16 -0400 (0:00:00.250) 0:12:34.234 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" 
} TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:30:16 -0400 (0:00:00.276) 0:12:34.511 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:30:16 -0400 (0:00:00.277) 0:12:34.789 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:30:17 -0400 (0:00:00.263) 0:12:35.052 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:30:17 -0400 (0:00:00.282) 0:12:35.334 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:30:17 -0400 (0:00:00.166) 0:12:35.500 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:30:17 -0400 (0:00:00.223) 0:12:35.724 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:30:18 -0400 (0:00:00.320) 0:12:36.045 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:30:18 -0400 (0:00:00.285) 0:12:36.330 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:30:18 -0400 (0:00:00.319) 0:12:36.650 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:30:18 -0400 (0:00:00.227) 0:12:36.878 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:30:19 -0400 (0:00:00.237) 0:12:37.116 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:30:19 -0400 (0:00:00.276) 0:12:37.393 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:30:19 -0400 (0:00:00.401) 0:12:37.794 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:30:20 -0400 (0:00:00.410) 0:12:38.205 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:30:20 -0400 (0:00:00.327) 0:12:38.533 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:30:20 -0400 (0:00:00.292) 0:12:38.825 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:30:21 -0400 (0:00:00.311) 0:12:39.137 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:30:21 -0400 (0:00:00.333) 0:12:39.471 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:30:22 -0400 (0:00:00.474) 0:12:39.945 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:30:22 -0400 (0:00:00.360) 0:12:40.305 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:30:22 -0400 (0:00:00.393) 0:12:40.698 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:30:23 -0400 (0:00:00.437) 0:12:41.136 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:30:23 -0400 (0:00:00.331) 0:12:41.467 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:30:23 -0400 (0:00:00.306) 0:12:41.774 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:30:24 -0400 (0:00:00.378) 0:12:42.152 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 
Wednesday 18 March 2026 20:30:24 -0400 (0:00:00.360) 0:12:42.512 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:30:24 -0400 (0:00:00.303) 0:12:42.816 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:30:25 -0400 (0:00:00.274) 0:12:43.091 ******* ok: [managed-node4] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:30:25 -0400 (0:00:00.255) 0:12:43.346 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:30:25 -0400 (0:00:00.202) 0:12:43.549 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:30:25 -0400 (0:00:00.313) 0:12:43.862 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:30:26 -0400 (0:00:00.257) 0:12:44.120 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:30:26 -0400 (0:00:00.243) 0:12:44.364 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:30:26 -0400 (0:00:00.258) 0:12:44.623 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:30:26 -0400 (0:00:00.248) 0:12:44.872 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:30:27 -0400 (0:00:00.221) 0:12:45.093 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:30:27 -0400 (0:00:00.295) 0:12:45.389 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:30:27 -0400 (0:00:00.210) 0:12:45.600 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:30:27 -0400 (0:00:00.267) 0:12:45.867 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:351 Wednesday 18 March 2026 20:30:28 -0400 (0:00:00.233) 0:12:46.101 ******* ok: [managed-node4] => { "changed": false, "path": "/tmp/storage_testggyp06_hlukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:361 Wednesday 18 March 2026 20:30:29 -0400 (0:00:01.418) 0:12:47.520 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:30:30 -0400 (0:00:00.592) 0:12:48.113 ******* ok: [managed-node4] => { "ansible_facts": { 
"storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:30:30 -0400 (0:00:00.400) 0:12:48.513 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:30:31 -0400 (0:00:00.407) 0:12:48.921 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:30:31 -0400 (0:00:00.003) 0:12:48.924 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:30:31 -0400 (0:00:00.221) 0:12:49.146 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:30:31 -0400 (0:00:00.419) 0:12:49.566 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:30:31 -0400 (0:00:00.312) 0:12:49.879 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:30:34 -0400 (0:00:02.073) 0:12:51.952 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": 
"CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:30:34 -0400 (0:00:00.692) 0:12:52.645 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:30:34 -0400 (0:00:00.204) 0:12:52.849 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:30:35 -0400 (0:00:00.258) 0:12:53.108 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:30:35 -0400 (0:00:00.286) 0:12:53.395 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:30:35 -0400 (0:00:00.277) 0:12:53.672 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:30:36 -0400 (0:00:00.848) 0:12:54.521 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:30:38 -0400 (0:00:02.035) 0:12:56.556 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "mount_point": 
"/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:30:39 -0400 (0:00:00.380) 0:12:56.937 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:30:39 -0400 (0:00:00.357) 0:12:57.295 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:30:41 -0400 (0:00:02.151) 0:12:59.446 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:30:41 -0400 (0:00:00.434) 0:12:59.881 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:30:42 -0400 (0:00:00.228) 0:13:00.110 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:30:42 -0400 (0:00:00.283) 0:13:00.393 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:30:42 -0400 (0:00:00.158) 0:13:00.552 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:30:44 -0400 (0:00:02.051) 0:13:02.604 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": 
"apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": 
"dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { 
"name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", 
"state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": 
"systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" 
}, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the 
systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:30:47 -0400 (0:00:03.013) 0:13:05.618 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:30:48 -0400 (0:00:00.607) 0:13:06.225 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:30:50 -0400 (0:00:02.488) 0:13:08.714 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 
'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:30:51 -0400 (0:00:00.470) 0:13:09.185 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:30:51 -0400 (0:00:00.596) 0:13:09.781 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:30:52 -0400 (0:00:00.333) 0:13:10.115 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:30:52 -0400 (0:00:00.573) 0:13:10.690 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:380 Wednesday 18 March 2026 20:30:53 -0400 (0:00:00.365) 0:13:11.056 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:30:53 -0400 (0:00:00.434) 0:13:11.490 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:30:53 -0400 (0:00:00.029) 0:13:11.520 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 
2026 20:30:53 -0400 (0:00:00.277) 0:13:11.797 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:30:54 -0400 (0:00:00.570) 0:13:12.368 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:30:54 -0400 (0:00:00.408) 0:13:12.777 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:30:57 -0400 (0:00:02.233) 0:13:15.010 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:30:57 -0400 (0:00:00.707) 0:13:15.717 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:30:58 -0400 (0:00:00.255) 0:13:15.973 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of 
pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:30:58 -0400 (0:00:00.236) 0:13:16.209 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:30:58 -0400 (0:00:00.259) 0:13:16.469 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:30:58 -0400 (0:00:00.256) 0:13:16.725 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:30:59 -0400 (0:00:00.856) 0:13:17.582 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:31:01 -0400 (0:00:02.042) 0:13:19.625 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:31:02 -0400 (0:00:00.499) 0:13:20.124 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:31:02 -0400 (0:00:00.319) 0:13:20.444 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:31:05 -0400 (0:00:02.622) 0:13:23.067 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:31:05 -0400 (0:00:00.500) 0:13:23.567 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:31:05 -0400 (0:00:00.209) 0:13:23.785 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:31:06 -0400 (0:00:00.319) 0:13:24.105 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:31:06 -0400 (0:00:00.239) 0:13:24.345 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:31:08 -0400 (0:00:02.044) 0:13:26.389 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": 
"cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": 
"dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" 
}, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { 
"name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", 
"state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:31:12 -0400 (0:00:03.931) 0:13:30.320 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:31:12 -0400 (0:00:00.521) 0:13:30.842 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", 
"fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:31:26 -0400 (0:00:13.199) 0:13:44.041 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:31:26 -0400 (0:00:00.233) 0:13:44.275 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880168.2235732, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "cb6fe76d6b6291a4f0bc966f5b3967a191d17502", "ctime": 1773880168.2205732, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880168.2205732, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:31:27 -0400 (0:00:01.309) 0:13:45.585 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:31:29 -0400 (0:00:01.375) 0:13:46.960 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:31:29 -0400 (0:00:00.547) 0:13:47.508 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", 
"/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" } ], "packages": [ "lvm2", "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:31:30 -0400 (0:00:00.526) 0:13:48.035 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, 
"encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:31:30 -0400 (0:00:00.403) 0:13:48.439 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:31:30 -0400 (0:00:00.208) 0:13:48.647 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0c01a968-7a32-44e8-ab80-9164045a86b1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:31:32 -0400 (0:00:01.518) 0:13:50.166 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:31:33 -0400 (0:00:01.672) 0:13:51.838 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" }, "name": "/opt/test1", 
"opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:31:35 -0400 (0:00:01.357) 0:13:53.196 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:31:35 -0400 (0:00:00.412) 0:13:53.608 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:31:37 -0400 (0:00:01.593) 0:13:55.201 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880179.0825949, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "123140b2a05861ab38f475e3caf954223b8b98a3", "ctime": 1773880172.7305822, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 654311760, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773880172.7311573, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "3089884991", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:31:38 -0400 (0:00:01.243) 0:13:56.445 ******* changed: [managed-node4] => (item={'backing_device': '/dev/sda1', 'name': 'luks-0c01a968-7a32-44e8-ab80-9164045a86b1', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node4] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", 
"changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:31:41 -0400 (0:00:02.695) 0:13:59.141 ******* ok: [managed-node4] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:398 Wednesday 18 March 2026 20:31:43 -0400 (0:00:01.868) 0:14:01.009 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:31:43 -0400 (0:00:00.478) 0:14:01.487 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:31:44 -0400 (0:00:00.467) 0:14:01.954 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:31:44 -0400 (0:00:00.275) 0:14:02.230 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "e3dc7518-65da-4f95-b3a9-fe385f43a1b3" }, "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "size": "4G", "type": "crypt", "uuid": "f73ca6b2-ae10-46a8-8f70-81fdeeeee1d7" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:31:45 -0400 (0:00:01.331) 0:14:03.561 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003465", "end": "2026-03-18 20:31:46.751804", "rc": 0, "start": "2026-03-18 20:31:46.748339" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:31:46 -0400 (0:00:01.228) 0:14:04.790 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002861", "end": "2026-03-18 20:31:47.911350", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:31:47.908489" } STDOUT: luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:31:48 -0400 (0:00:01.287) 0:14:06.077 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': 
None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:31:48 -0400 (0:00:00.571) 0:14:06.649 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:31:48 -0400 (0:00:00.217) 0:14:06.866 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.030058", "end": "2026-03-18 20:31:50.104111", "rc": 0, "start": "2026-03-18 20:31:50.074053" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:31:50 -0400 (0:00:01.307) 0:14:08.174 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:31:50 -0400 (0:00:00.396) 0:14:08.570 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:31:51 -0400 (0:00:00.540) 0:14:09.111 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:31:51 -0400 (0:00:00.507) 0:14:09.619 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:31:56 -0400 (0:00:04.385) 0:14:14.004 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:31:57 -0400 (0:00:01.624) 0:14:15.629 ******* ok: 
[managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:31:58 -0400 (0:00:00.399) 0:14:16.028 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:31:58 -0400 (0:00:00.406) 0:14:16.435 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:31:58 -0400 (0:00:00.324) 0:14:16.760 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:31:59 -0400 (0:00:00.332) 0:14:17.092 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:31:59 -0400 (0:00:00.317) 0:14:17.410 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:31:59 -0400 (0:00:00.394) 0:14:17.805 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:97248): WARNING **: 20:32:01.076: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:32:01 -0400 (0:00:01.409) 0:14:19.214 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:32:01 -0400 (0:00:00.306) 0:14:19.521 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:32:02 -0400 (0:00:00.639) 0:14:20.161 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:32:02 -0400 (0:00:00.379) 0:14:20.541 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:32:02 -0400 (0:00:00.186) 0:14:20.727 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:32:03 -0400 (0:00:00.198) 0:14:20.926 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:32:03 -0400 (0:00:00.193) 0:14:21.119 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:32:03 -0400 (0:00:00.268) 0:14:21.387 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:32:03 -0400 (0:00:00.243) 0:14:21.631 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:32:03 -0400 (0:00:00.240) 0:14:21.872 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:32:04 -0400 (0:00:00.297) 0:14:22.169 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:32:04 -0400 (0:00:00.223) 0:14:22.393 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:32:04 -0400 (0:00:00.288) 0:14:22.682 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:32:05 -0400 (0:00:00.291) 0:14:22.973 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:32:05 -0400 (0:00:00.728) 0:14:23.702 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:32:06 -0400 (0:00:00.675) 0:14:24.377 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:32:06 -0400 (0:00:00.288) 0:14:24.666 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:32:07 -0400 (0:00:00.307) 0:14:24.973 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:32:07 -0400 (0:00:00.194) 0:14:25.168 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:32:07 -0400 (0:00:00.217) 0:14:25.385 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:32:07 -0400 (0:00:00.342) 
0:14:25.728 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:32:08 -0400 (0:00:00.275) 0:14:26.004 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:32:08 -0400 (0:00:00.284) 0:14:26.288 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:32:09 -0400 (0:00:00.759) 0:14:27.047 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:32:09 -0400 (0:00:00.488) 0:14:27.536 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:32:09 -0400 (0:00:00.225) 0:14:27.762 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK 
[Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:32:10 -0400 (0:00:00.247) 0:14:28.009 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:32:10 -0400 (0:00:00.274) 0:14:28.284 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:32:10 -0400 (0:00:00.228) 0:14:28.513 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:32:11 -0400 (0:00:00.741) 0:14:29.254 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:32:11 -0400 (0:00:00.313) 0:14:29.567 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:32:11 -0400 (0:00:00.287) 0:14:29.855 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:32:12 -0400 (0:00:00.463) 0:14:30.319 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:32:12 -0400 (0:00:00.377) 0:14:30.697 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:32:13 -0400 (0:00:00.356) 0:14:31.053 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:32:13 -0400 (0:00:00.288) 0:14:31.342 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:32:13 -0400 (0:00:00.305) 0:14:31.648 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:32:14 -0400 (0:00:00.311) 0:14:31.960 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:32:14 -0400 (0:00:00.249) 0:14:32.209 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:32:14 -0400 (0:00:00.352) 0:14:32.562 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:32:15 -0400 (0:00:00.719) 0:14:33.281 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': 
None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:32:15 -0400 (0:00:00.482) 0:14:33.764 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:32:16 -0400 (0:00:00.233) 0:14:33.997 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:32:16 -0400 (0:00:00.137) 0:14:34.135 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:32:16 -0400 (0:00:00.202) 0:14:34.337 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:32:16 -0400 (0:00:00.200) 0:14:34.537 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:32:16 -0400 (0:00:00.253) 0:14:34.791 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } 
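Each of the VDO checks above is skipped because its "when:" guard evaluated to false; the "false_condition" field in each skip echoes that guard, and this test volume sets neither deduplication nor compression. A minimal sketch of what such a guarded check can look like (hypothetical tasks shown for illustration only, not the actual contents of verify-pool-member-vdo.yml; the variable names storage_test_vdo_volume, storage_test_vdo_status and the _raw_device field are taken from the log, while the lvs command and the assertion body are assumptions):

    # Illustrative only: gather segment info for the volume, then assert on it.
    # Both tasks carry the same guard, so when deduplication/compression are unset
    # each one is skipped and Ansible reports the guard as "false_condition".
    - name: Get information about VDO deduplication
      ansible.builtin.command: lvs --noheadings -o segtype {{ storage_test_vdo_volume._raw_device }}
      register: storage_test_vdo_status
      changed_when: false
      when: >-
        storage_test_vdo_volume.deduplication != none or
        storage_test_vdo_volume.compression != none

    - name: Check if VDO deduplication is off
      ansible.builtin.assert:
        that:
          - "'vdo' not in storage_test_vdo_status.stdout"
      when: >-
        storage_test_vdo_volume.deduplication != none or
        storage_test_vdo_volume.compression != none

Because the guard sits on every task in the block rather than on a single include, the log shows one skip entry per task instead of one skipped include.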
TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:32:17 -0400 (0:00:00.367) 0:14:35.159 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:32:17 -0400 (0:00:00.257) 0:14:35.416 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:32:18 -0400 (0:00:00.705) 0:14:36.122 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:32:18 -0400 (0:00:00.234) 0:14:36.357 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:32:18 -0400 (0:00:00.263) 0:14:36.620 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:32:18 -0400 (0:00:00.220) 0:14:36.841 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:32:19 -0400 (0:00:00.159) 0:14:37.000 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:32:19 -0400 (0:00:00.314) 0:14:37.315 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:32:19 -0400 (0:00:00.144) 0:14:37.460 
******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:32:19 -0400 (0:00:00.241) 0:14:37.701 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:32:20 -0400 (0:00:00.338) 0:14:38.039 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:32:20 -0400 (0:00:00.492) 0:14:38.532 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:32:20 -0400 (0:00:00.297) 0:14:38.829 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:32:23 -0400 (0:00:02.250) 0:14:41.080 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:32:23 -0400 (0:00:00.400) 0:14:41.481 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:32:24 -0400 (0:00:00.434) 0:14:41.915 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:32:24 -0400 (0:00:00.428) 0:14:42.344 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:32:24 -0400 (0:00:00.348) 0:14:42.692 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:32:25 -0400 (0:00:00.395) 0:14:43.087 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK 
[Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:32:25 -0400 (0:00:00.420) 0:14:43.508 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:32:26 -0400 (0:00:00.444) 0:14:43.953 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:32:26 -0400 (0:00:00.228) 0:14:44.181 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:32:26 -0400 (0:00:00.277) 0:14:44.459 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:32:26 -0400 (0:00:00.240) 0:14:44.699 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:32:27 -0400 (0:00:00.253) 0:14:44.952 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:32:27 -0400 (0:00:00.677) 0:14:45.657 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:32:28 -0400 (0:00:00.372) 0:14:46.029 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:32:28 -0400 (0:00:00.442) 0:14:46.472 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:32:28 -0400 (0:00:00.225) 0:14:46.698 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:32:29 -0400 (0:00:00.273) 0:14:46.971 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:32:29 -0400 (0:00:00.177) 0:14:47.149 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:32:29 -0400 (0:00:00.334) 0:14:47.484 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:32:29 -0400 (0:00:00.146) 0:14:47.630 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880285.3588076, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880285.3588076, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2298, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880285.3588076, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:32:30 -0400 (0:00:01.249) 0:14:48.879 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:32:31 -0400 (0:00:00.324) 0:14:49.204 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:32:31 -0400 (0:00:00.304) 0:14:49.508 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:32:31 -0400 (0:00:00.365) 0:14:49.874 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:32:32 -0400 (0:00:00.289) 0:14:50.164 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:32:32 -0400 (0:00:00.265) 0:14:50.429 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:32:32 -0400 (0:00:00.355) 0:14:50.785 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880285.8818085, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880285.8818085, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2346, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880285.8818085, "nlink": 1, "path": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:32:34 -0400 (0:00:01.503) 0:14:52.288 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:32:36 -0400 (0:00:01.961) 0:14:54.250 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007021", "end": "2026-03-18 20:32:37.521003", "rc": 0, "start": "2026-03-18 20:32:37.513982" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           e3dc7518-65da-4f95-b3a9-fe385f43a1b3
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     687099
        Threads:    2
        Salt:       b4 a0 df e2 6e f0 20 d7 e5 17 4d b1 5e 21 a8 a0 82 0e cc 9c 02 d6 21 c6 65 fc c2 50 df cc f3 7b
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 131730
        Salt:       1b 12 51 0d 4a 2b 15 79 43 9c e4 29 1a ca bf 06 16 d3 25 98 7f 96 2d 51 0b 55 19 13 48 b5 99 d0
        Digest:     4b fa ca 4c c8 cc 74 f9 ff ba 2a 68 13 bd 61 94 5d 55 61 86 d9 c4 a1 4a e2 28 0b 50 b5 1b 88 b9

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:32:37 -0400 (0:00:01.335) 0:14:55.586 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:32:38 -0400 (0:00:00.337) 0:14:55.923 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:32:38 -0400 (0:00:00.399) 0:14:56.323 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:32:38 -0400 (0:00:00.391) 0:14:56.714 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:32:39 -0400 (0:00:00.400) 0:14:57.115 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size]
***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:32:39 -0400 (0:00:00.752) 0:14:57.868 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:32:40 -0400 (0:00:00.703) 0:14:58.572 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:32:41 -0400 (0:00:00.601) 0:14:59.174 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:32:41 -0400 (0:00:00.550) 0:14:59.725 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:32:42 -0400 (0:00:00.428) 0:15:00.153 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:32:42 -0400 (0:00:00.332) 0:15:00.485 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:32:42 -0400 (0:00:00.386) 0:15:00.872 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:32:43 -0400 (0:00:00.445) 0:15:01.318 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:32:43 -0400 (0:00:00.281) 0:15:01.599 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result 
was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:32:43 -0400 (0:00:00.280) 0:15:01.880 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:32:44 -0400 (0:00:00.264) 0:15:02.145 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:32:44 -0400 (0:00:00.248) 0:15:02.394 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:32:44 -0400 (0:00:00.295) 0:15:02.689 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:32:45 -0400 (0:00:00.225) 0:15:02.914 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:32:45 -0400 (0:00:00.264) 0:15:03.179 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:32:45 -0400 (0:00:00.265) 0:15:03.456 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:32:45 -0400 (0:00:00.189) 0:15:03.645 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:32:45 -0400 (0:00:00.202) 0:15:03.848 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:32:46 -0400 (0:00:00.193) 0:15:04.041 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:32:49 -0400 (0:00:03.385) 0:15:07.427 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:32:50 -0400 (0:00:01.449) 0:15:08.876 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:32:51 -0400 (0:00:00.474) 0:15:09.351 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:32:51 -0400 (0:00:00.302) 0:15:09.654 ******* ok: [managed-node4] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:32:53 -0400 (0:00:01.371) 0:15:11.025 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:32:53 -0400 (0:00:00.424) 0:15:11.450 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:32:53 -0400 (0:00:00.246) 0:15:11.696 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:32:54 -0400 (0:00:00.294) 0:15:11.990 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:32:54 -0400 (0:00:00.334) 0:15:12.324 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:32:54 -0400 (0:00:00.351) 0:15:12.676 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:32:55 -0400 (0:00:00.374) 0:15:13.051 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:32:55 -0400 (0:00:00.389) 0:15:13.441 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:32:55 -0400 (0:00:00.389) 0:15:13.831 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:32:56 -0400 (0:00:00.371) 0:15:14.202 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:32:56 -0400 (0:00:00.327) 0:15:14.530 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 
Wednesday 18 March 2026 20:32:57 -0400 (0:00:00.354) 0:15:14.885 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:32:57 -0400 (0:00:00.299) 0:15:15.185 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:32:57 -0400 (0:00:00.343) 0:15:15.528 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:32:57 -0400 (0:00:00.303) 0:15:15.832 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:32:58 -0400 (0:00:00.329) 0:15:16.162 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:32:58 -0400 (0:00:00.390) 0:15:16.552 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:32:58 -0400 (0:00:00.278) 0:15:16.836 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:32:59 -0400 (0:00:00.285) 0:15:17.121 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:32:59 -0400 (0:00:00.353) 0:15:17.474 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] 
************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:32:59 -0400 (0:00:00.302) 0:15:17.777 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:33:00 -0400 (0:00:00.293) 0:15:18.071 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:33:00 -0400 (0:00:00.518) 0:15:18.589 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030498", "end": "2026-03-18 20:33:02.003236", "rc": 0, "start": "2026-03-18 20:33:01.972738" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:33:02 -0400 (0:00:01.468) 0:15:20.058 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:33:02 -0400 (0:00:00.441) 0:15:20.499 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:33:03 -0400 (0:00:00.484) 0:15:20.983 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:33:03 -0400 (0:00:00.351) 0:15:21.335 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:33:03 -0400 (0:00:00.293) 0:15:21.629 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:33:04 -0400 (0:00:00.329) 0:15:21.959 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:33:04 -0400 (0:00:00.313) 0:15:22.273 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:33:04 -0400 (0:00:00.295) 0:15:22.568 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:33:04 -0400 (0:00:00.238) 0:15:22.806 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:401 Wednesday 18 March 2026 20:33:05 -0400 (0:00:00.323) 0:15:23.130 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:33:05 -0400 (0:00:00.466) 0:15:23.597 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:33:05 -0400 (0:00:00.014) 0:15:23.611 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:33:06 -0400 (0:00:00.291) 0:15:23.903 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:33:06 -0400 (0:00:00.407) 0:15:24.311 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:33:06 -0400 (0:00:00.403) 0:15:24.715 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:33:09 -0400 (0:00:02.330) 0:15:27.047 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:33:09 -0400 (0:00:00.810) 0:15:27.858 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:33:10 -0400 (0:00:00.389) 0:15:28.247 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:33:10 -0400 (0:00:00.346) 0:15:28.594 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:33:11 -0400 (0:00:00.291) 0:15:28.885 ******* ok: 
[managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:33:11 -0400 (0:00:00.258) 0:15:29.143 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:33:12 -0400 (0:00:00.882) 0:15:30.026 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:33:14 -0400 (0:00:02.072) 0:15:32.099 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:33:14 -0400 (0:00:00.360) 0:15:32.460 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:33:14 -0400 (0:00:00.320) 0:15:32.780 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:33:17 -0400 (0:00:02.456) 0:15:35.237 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:33:17 -0400 (0:00:00.558) 0:15:35.796 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:33:18 -0400 (0:00:00.173) 0:15:35.969 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:33:21 -0400 (0:00:03.149) 0:15:39.119 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:33:21 -0400 (0:00:00.240) 0:15:39.359 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:33:23 -0400 (0:00:02.067) 0:15:41.427 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": 
"inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" 
}, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": 
"quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", 
"state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service": { "name": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { 
"name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": 
"systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:33:26 -0400 (0:00:03.076) 0:15:44.503 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d0c01a968\x2d7a32\x2d44e8\x2dab80\x2d9164045a86b1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "name": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target tmp.mount -.mount \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda1.device systemd-udevd-kernel.socket systemd-journald.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d0c01a968\\\\x2d7a32\\\\x2d44e8\\\\x2dab80\\\\x2d9164045a86b1.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable 
cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-0c01a968-7a32-44e8-ab80-9164045a86b1", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0c01a968-7a32-44e8-ab80-9164045a86b1 /dev/sda1 /tmp/storage_testggyp06_hlukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0c01a968-7a32-44e8-ab80-9164045a86b1 /dev/sda1 /tmp/storage_testggyp06_hlukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0c01a968-7a32-44e8-ab80-9164045a86b1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0c01a968-7a32-44e8-ab80-9164045a86b1 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", 
"LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d0c01a968\\\\x2d7a32\\\\x2d44e8\\\\x2dab80\\\\x2d9164045a86b1.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_testggyp06_hlukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:31:37 EDT", "StateChangeTimestampMonotonic": "2354496462", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": 
"2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d0c01a968\\\\x2d7a32\\\\x2d44e8\\\\x2dab80\\\\x2d9164045a86b1.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:33:28 -0400 (0:00:02.160) 0:15:46.663 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:33:31 -0400 (0:00:02.839) 
0:15:49.503 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:33:31 -0400 (0:00:00.269) 0:15:49.772 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880295.1828272, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "ebd66eaed45707a70cf51337cb298025e4a234d0", "ctime": 1773880295.1798272, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880295.1798272, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:33:33 -0400 (0:00:01.415) 0:15:51.188 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:33:33 -0400 (0:00:00.305) 0:15:51.494 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2d0c01a968\x2d7a32\x2d44e8\x2dab80\x2d9164045a86b1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "name": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin 
cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d0c01a968\\x2d7a32\\x2d44e8\\x2dab80\\x2d9164045a86b1.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", 
"Names": "\"systemd-cryptsetup@luks\\\\x2d0c01a968\\\\x2d7a32\\\\x2d44e8\\\\x2dab80\\\\x2d9164045a86b1.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:33:35 -0400 (0:00:02.081) 0:15:53.576 ******* ok: [managed-node4] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, 
"encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:33:36 -0400 (0:00:00.501) 0:15:54.077 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:33:36 -0400 (0:00:00.356) 0:15:54.434 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:33:36 -0400 (0:00:00.289) 0:15:54.724 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:33:37 -0400 (0:00:00.365) 0:15:55.089 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:33:38 -0400 (0:00:01.761) 0:15:56.851 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node4] => (item={'src': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:33:40 -0400 (0:00:01.586) 0:15:58.462 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:33:41 -0400 
(0:00:00.549) 0:15:59.012 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:33:42 -0400 (0:00:01.670) 0:16:00.682 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880307.910853, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7eb0c7b85e385ff27a39fdeed0099fd6f26adfe9", "ctime": 1773880301.029839, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 213909639, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773880301.0303779, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3757671366", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:33:44 -0400 (0:00:01.378) 0:16:02.060 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:33:44 -0400 (0:00:00.250) 0:16:02.311 ******* ok: [managed-node4] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:414 Wednesday 18 March 2026 20:33:46 -0400 (0:00:02.039) 0:16:04.350 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:421 Wednesday 18 March 2026 20:33:46 -0400 (0:00:00.256) 0:16:04.625 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:33:47 -0400 (0:00:00.444) 0:16:05.070 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": 
"/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:33:47 -0400 (0:00:00.266) 0:16:05.336 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:33:47 -0400 (0:00:00.225) 0:16:05.562 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "e3dc7518-65da-4f95-b3a9-fe385f43a1b3" }, "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "size": "4G", "type": "crypt", "uuid": "f73ca6b2-ae10-46a8-8f70-81fdeeeee1d7" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": 
"/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:33:48 -0400 (0:00:01.116) 0:16:06.679 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003005", "end": "2026-03-18 20:33:49.893787", "rc": 0, "start": "2026-03-18 20:33:49.890782" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:33:50 -0400 (0:00:01.263) 0:16:07.942 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003067", "end": "2026-03-18 20:33:51.047492", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:33:51.044425" } STDOUT: luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:33:51 -0400 (0:00:01.152) 0:16:09.095 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': 
None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:33:51 -0400 (0:00:00.549) 0:16:09.644 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:33:51 -0400 (0:00:00.235) 0:16:09.880 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029979", "end": "2026-03-18 20:33:53.093145", "rc": 0, "start": "2026-03-18 20:33:53.063166" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:33:53 -0400 (0:00:01.230) 0:16:11.110 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:33:53 -0400 (0:00:00.323) 0:16:11.434 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:33:54 -0400 (0:00:00.723) 0:16:12.157 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each 
member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:33:54 -0400 (0:00:00.419) 0:16:12.576 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:33:55 -0400 (0:00:01.096) 0:16:13.673 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:33:56 -0400 (0:00:00.286) 0:16:13.960 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:33:56 -0400 (0:00:00.309) 0:16:14.270 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:33:56 -0400 (0:00:00.298) 0:16:14.568 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:33:56 -0400 (0:00:00.239) 0:16:14.808 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:33:57 -0400 (0:00:00.340) 0:16:15.148 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:33:57 -0400 (0:00:00.321) 0:16:15.469 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:33:57 -0400 (0:00:00.259) 0:16:15.729 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:101960): WARNING **: 20:33:58.556: failed to load module nvme: libbd_nvme.so.2: cannot open 
shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:33:58 -0400 (0:00:00.978) 0:16:16.708 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.267) 0:16:16.975 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.390) 0:16:17.366 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.189) 0:16:17.555 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.094) 0:16:17.650 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version 
regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.096) 0:16:17.747 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:33:59 -0400 (0:00:00.122) 0:16:17.869 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:34:00 -0400 (0:00:00.141) 0:16:18.011 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:34:00 -0400 (0:00:00.216) 0:16:18.228 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:34:00 -0400 (0:00:00.158) 0:16:18.386 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:34:00 -0400 (0:00:00.262) 0:16:18.648 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:34:00 -0400 (0:00:00.198) 0:16:18.847 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:34:01 -0400 (0:00:00.209) 0:16:19.057 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:34:01 -0400 (0:00:00.278) 0:16:19.335 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:34:02 -0400 (0:00:00.657) 0:16:19.993 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:34:02 -0400 (0:00:00.573) 0:16:20.566 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:34:02 -0400 (0:00:00.281) 0:16:20.848 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:34:03 -0400 (0:00:00.223) 0:16:21.071 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:34:03 -0400 (0:00:00.312) 0:16:21.384 ******* 
skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:34:03 -0400 (0:00:00.336) 0:16:21.720 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:34:04 -0400 (0:00:00.296) 0:16:22.016 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:34:04 -0400 (0:00:00.307) 0:16:22.324 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:34:04 -0400 (0:00:00.305) 0:16:22.630 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:34:05 -0400 (0:00:00.590) 0:16:23.220 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information 
about thinpool] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:34:05 -0400 (0:00:00.523) 0:16:23.744 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:34:06 -0400 (0:00:00.237) 0:16:23.981 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:34:06 -0400 (0:00:00.197) 0:16:24.179 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:34:06 -0400 (0:00:00.210) 0:16:24.389 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:34:06 -0400 (0:00:00.260) 0:16:24.650 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:34:07 -0400 (0:00:00.594) 0:16:25.244 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:34:07 -0400 (0:00:00.290) 0:16:25.535 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:34:07 -0400 (0:00:00.243) 0:16:25.778 ******* included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:34:08 -0400 (0:00:00.504) 0:16:26.283 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:34:08 -0400 (0:00:00.419) 0:16:26.703 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:34:09 -0400 (0:00:00.340) 0:16:27.044 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:34:09 -0400 (0:00:00.256) 0:16:27.301 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:34:09 -0400 (0:00:00.252) 0:16:27.554 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:34:09 -0400 (0:00:00.228) 0:16:27.804 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:34:10 -0400 (0:00:00.230) 0:16:28.035 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:34:10 -0400 (0:00:00.211) 0:16:28.246 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] 
*************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:34:11 -0400 (0:00:00.701) 0:16:28.948 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:34:11 -0400 (0:00:00.487) 0:16:29.435 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:34:11 -0400 (0:00:00.221) 0:16:29.657 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:34:12 -0400 (0:00:00.241) 0:16:29.898 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:34:12 -0400 (0:00:00.255) 0:16:30.154 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] 
*********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:34:12 -0400 (0:00:00.222) 0:16:30.376 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:34:12 -0400 (0:00:00.203) 0:16:30.579 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:34:12 -0400 (0:00:00.241) 0:16:30.820 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:34:13 -0400 (0:00:00.323) 0:16:31.144 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:34:13 -0400 (0:00:00.693) 0:16:31.837 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:34:14 -0400 (0:00:00.268) 0:16:32.106 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:34:14 -0400 (0:00:00.223) 0:16:32.329 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:34:14 -0400 (0:00:00.280) 0:16:32.636 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:34:14 -0400 (0:00:00.224) 0:16:32.860 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:34:15 -0400 (0:00:00.266) 0:16:33.127 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:34:15 -0400 (0:00:00.264) 0:16:33.392 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:34:15 -0400 (0:00:00.225) 0:16:33.617 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:34:16 -0400 (0:00:00.270) 0:16:33.888 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:34:16 -0400 (0:00:00.418) 0:16:34.307 ******* ok: [managed-node4] => { "ansible_facts": { 
"_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:34:16 -0400 (0:00:00.312) 0:16:34.619 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:34:18 -0400 (0:00:01.947) 0:16:36.567 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:34:19 -0400 (0:00:00.393) 0:16:36.961 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:34:19 -0400 (0:00:00.394) 0:16:37.356 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:34:19 -0400 (0:00:00.490) 0:16:37.846 ******* ok: [managed-node4] => { "changed": false } MSG: All 
assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:34:20 -0400 (0:00:00.443) 0:16:38.289 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:34:20 -0400 (0:00:00.375) 0:16:38.665 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:34:21 -0400 (0:00:00.366) 0:16:39.031 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:34:21 -0400 (0:00:00.445) 0:16:39.477 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:34:23 -0400 (0:00:01.533) 0:16:41.010 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:34:23 -0400 (0:00:00.211) 0:16:41.222 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:34:23 -0400 (0:00:00.229) 0:16:41.451 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:34:23 -0400 (0:00:00.241) 0:16:41.693 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", 
"storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:34:24 -0400 (0:00:00.772) 0:16:42.466 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:34:24 -0400 (0:00:00.365) 0:16:42.831 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:34:25 -0400 (0:00:00.483) 0:16:43.315 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:34:25 -0400 (0:00:00.327) 0:16:43.642 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:34:26 -0400 (0:00:00.381) 0:16:44.024 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:34:26 -0400 (0:00:00.253) 0:16:44.278 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:34:26 -0400 (0:00:00.410) 0:16:44.689 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:34:27 -0400 (0:00:00.442) 0:16:45.132 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880357.5199528, 
"attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880285.3588076, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2298, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880285.3588076, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:34:28 -0400 (0:00:01.290) 0:16:46.423 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:34:28 -0400 (0:00:00.285) 0:16:46.708 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:34:29 -0400 (0:00:00.199) 0:16:46.907 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:34:29 -0400 (0:00:00.324) 0:16:47.232 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:34:29 -0400 (0:00:00.322) 0:16:47.554 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:34:29 -0400 (0:00:00.224) 0:16:47.779 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:34:30 -0400 (0:00:00.343) 0:16:48.123 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880411.329061, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880285.8818085, "dev": 6, "device_type": 
64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2346, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880285.8818085, "nlink": 1, "path": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:34:31 -0400 (0:00:01.346) 0:16:49.469 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:34:33 -0400 (0:00:01.801) 0:16:51.270 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007393", "end": "2026-03-18 20:34:34.614813", "rc": 0, "start": "2026-03-18 20:34:34.607420" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: e3dc7518-65da-4f95-b3a9-fe385f43a1b3 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 687099 Threads: 2 Salt: b4 a0 df e2 6e f0 20 d7 e5 17 4d b1 5e 21 a8 a0 82 0e cc 9c 02 d6 21 c6 65 fc c2 50 df cc f3 7b AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131730 Salt: 1b 12 51 0d 4a 2b 15 79 43 9c e4 29 1a ca bf 06 16 d3 25 98 7f 96 2d 51 0b 55 19 13 48 b5 99 d0 Digest: 4b fa ca 4c c8 cc 74 f9 ff ba 2a 68 13 bd 61 94 5d 55 61 86 d9 c4 a1 4a e2 28 0b 50 b5 1b 88 b9 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:34:34 -0400 (0:00:01.404) 0:16:52.675 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:34:35 -0400 (0:00:00.317) 0:16:52.993 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:34:35 -0400 (0:00:00.391) 0:16:53.384 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] 
*********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:34:35 -0400 (0:00:00.371) 0:16:53.755 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:34:36 -0400 (0:00:00.297) 0:16:54.053 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:34:36 -0400 (0:00:00.493) 0:16:54.572 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:34:37 -0400 (0:00:00.422) 0:16:54.994 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:34:37 -0400 (0:00:00.354) 0:16:55.349 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:34:38 -0400 (0:00:00.567) 0:16:55.917 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:34:38 -0400 (0:00:00.378) 0:16:56.295 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:34:38 -0400 (0:00:00.415) 0:16:56.711 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:34:39 -0400 (0:00:00.390) 0:16:57.101 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:34:39 -0400 (0:00:00.428) 0:16:57.530 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:34:39 -0400 (0:00:00.260) 0:16:57.791 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:34:40 -0400 (0:00:00.258) 0:16:58.050 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:34:40 -0400 (0:00:00.293) 0:16:58.343 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:34:40 -0400 (0:00:00.211) 0:16:58.554 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:34:40 -0400 (0:00:00.250) 0:16:58.805 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:34:41 -0400 (0:00:00.210) 0:16:59.016 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:34:41 -0400 (0:00:00.182) 0:16:59.199 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:34:41 -0400 (0:00:00.251) 0:16:59.450 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:34:41 -0400 (0:00:00.223) 0:16:59.674 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:34:42 -0400 (0:00:00.221) 0:16:59.895 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:34:42 -0400 (0:00:00.177) 0:17:00.073 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:34:43 -0400 (0:00:01.388) 0:17:01.461 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:34:44 -0400 (0:00:01.371) 0:17:02.833 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:34:45 -0400 (0:00:00.386) 0:17:03.219 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:34:45 -0400 (0:00:00.421) 0:17:03.640 ******* ok: [managed-node4] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:34:47 -0400 (0:00:01.417) 0:17:05.058 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** 
task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:34:47 -0400 (0:00:00.302) 0:17:05.360 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:34:47 -0400 (0:00:00.359) 0:17:05.720 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:34:48 -0400 (0:00:00.349) 0:17:06.069 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:34:48 -0400 (0:00:00.351) 0:17:06.421 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:34:48 -0400 (0:00:00.328) 0:17:06.749 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:34:49 -0400 (0:00:00.343) 0:17:07.093 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:34:49 -0400 (0:00:00.316) 0:17:07.410 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:34:49 -0400 (0:00:00.358) 0:17:07.768 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:34:50 -0400 (0:00:00.353) 0:17:08.122 ******* skipping: [managed-node4] => { 
"changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:34:50 -0400 (0:00:00.389) 0:17:08.511 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:34:51 -0400 (0:00:00.492) 0:17:09.003 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:34:51 -0400 (0:00:00.420) 0:17:09.423 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:34:51 -0400 (0:00:00.363) 0:17:09.787 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:34:52 -0400 (0:00:00.297) 0:17:10.085 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:34:52 -0400 (0:00:00.305) 0:17:10.390 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:34:52 -0400 (0:00:00.274) 0:17:10.664 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:34:53 -0400 (0:00:00.289) 0:17:10.954 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:34:53 -0400 (0:00:00.272) 0:17:11.227 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:34:53 -0400 (0:00:00.354) 0:17:11.582 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:34:54 -0400 (0:00:00.360) 0:17:11.942 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:34:54 -0400 (0:00:00.361) 0:17:12.304 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:34:54 -0400 (0:00:00.562) 0:17:12.866 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.023333", "end": "2026-03-18 20:34:56.248143", "rc": 0, "start": "2026-03-18 20:34:56.224810" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:34:56 -0400 (0:00:01.571) 0:17:14.437 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:34:56 -0400 (0:00:00.143) 0:17:14.581 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:34:57 -0400 (0:00:00.371) 0:17:14.953 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 
Wednesday 18 March 2026 20:34:57 -0400 (0:00:00.260) 0:17:15.214 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:34:57 -0400 (0:00:00.249) 0:17:15.464 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:34:57 -0400 (0:00:00.303) 0:17:15.767 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:34:58 -0400 (0:00:00.300) 0:17:16.067 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:34:58 -0400 (0:00:00.271) 0:17:16.339 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:34:58 -0400 (0:00:00.239) 0:17:16.578 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:34:59 -0400 (0:00:00.373) 0:17:16.952 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:427 Wednesday 18 March 2026 20:35:00 -0400 (0:00:01.549) 0:17:18.501 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:35:01 -0400 (0:00:00.536) 0:17:19.037 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] 
}, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:35:01 -0400 (0:00:00.303) 0:17:19.341 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:35:01 -0400 (0:00:00.343) 0:17:19.685 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:35:01 -0400 (0:00:00.028) 0:17:19.713 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:35:02 -0400 (0:00:00.295) 0:17:20.009 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:35:02 -0400 (0:00:00.375) 0:17:20.386 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:35:02 -0400 (0:00:00.283) 0:17:20.670 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:35:05 -0400 (0:00:02.323) 0:17:22.994 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { 
"blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:35:05 -0400 (0:00:00.885) 0:17:23.880 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:35:06 -0400 (0:00:00.310) 0:17:24.190 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:35:06 -0400 (0:00:00.476) 0:17:24.667 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:35:07 -0400 (0:00:00.263) 0:17:24.930 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:35:07 -0400 (0:00:00.167) 0:17:25.098 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:35:07 -0400 (0:00:00.704) 0:17:25.802 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:35:09 -0400 (0:00:02.074) 0:17:27.877 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] 
} ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:35:10 -0400 (0:00:00.320) 0:17:28.198 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:35:10 -0400 (0:00:00.353) 0:17:28.551 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:35:13 -0400 (0:00:02.610) 0:17:31.162 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:35:13 -0400 (0:00:00.605) 0:17:31.767 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:35:14 -0400 (0:00:00.238) 0:17:32.005 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:35:14 -0400 (0:00:00.289) 0:17:32.295 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:35:14 -0400 (0:00:00.236) 0:17:32.531 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:35:16 -0400 (0:00:01.896) 0:17:34.428 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": 
"stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { 
"name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": 
"rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service": { "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, 
"systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": 
"systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:35:19 -0400 (0:00:03.012) 0:17:37.440 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-udevd-kernel.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target \"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\"", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "no", "Nice": 
"0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:33:35 EDT", "StateChangeTimestampMonotonic": "2472809262", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:35:21 -0400 (0:00:01.949) 0:17:39.390 ******* fatal: [managed-node4]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:35:23 -0400 (0:00:02.378) 0:17:41.768 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:35:24 -0400 (0:00:00.331) 0:17:42.100 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": 
"systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", 
"StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:33:35 EDT", "StateChangeTimestampMonotonic": "2472809262", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:35:26 -0400 (0:00:01.937) 0:17:44.038 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:35:26 -0400 (0:00:00.307) 0:17:44.346 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:35:26 -0400 (0:00:00.515) 0:17:44.861 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:35:27 -0400 (0:00:00.309) 0:17:45.171 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880500.39824, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773880500.39824, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773880500.39824, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2157974748", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] 
**************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:35:28 -0400 (0:00:01.442) 0:17:46.614 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:451 Wednesday 18 March 2026 20:35:29 -0400 (0:00:00.565) 0:17:47.179 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:35:30 -0400 (0:00:00.726) 0:17:47.905 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:35:30 -0400 (0:00:00.046) 0:17:47.951 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:35:30 -0400 (0:00:00.268) 0:17:48.220 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:35:30 -0400 (0:00:00.404) 0:17:48.624 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:35:31 -0400 (0:00:00.309) 0:17:48.933 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:35:33 -0400 (0:00:02.259) 0:17:51.193 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] 
}, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:35:34 -0400 (0:00:00.779) 0:17:51.973 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:35:34 -0400 (0:00:00.273) 0:17:52.246 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:35:34 -0400 (0:00:00.289) 0:17:52.536 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:35:34 -0400 (0:00:00.200) 0:17:52.737 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:35:35 -0400 (0:00:00.339) 0:17:53.076 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:35:36 -0400 (0:00:00.819) 0:17:53.896 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:35:38 -0400 (0:00:02.051) 0:17:55.948 ******* 
ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:35:38 -0400 (0:00:00.328) 0:17:56.276 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:35:38 -0400 (0:00:00.341) 0:17:56.618 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:35:41 -0400 (0:00:02.534) 0:17:59.153 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:35:41 -0400 (0:00:00.536) 0:17:59.690 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:35:42 -0400 (0:00:00.239) 0:17:59.930 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:35:42 -0400 (0:00:00.192) 0:18:00.122 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:35:42 -0400 (0:00:00.167) 0:18:00.290 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:35:44 -0400 (0:00:01.832) 0:18:02.122 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": 
"initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", 
"state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", 
"state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service": { "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { 
"name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": 
"ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:35:47 -0400 (0:00:03.133) 0:18:05.255 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\" umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", 
"MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:33:35 EDT", "StateChangeTimestampMonotonic": "2472809262", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:35:49 -0400 (0:00:01.962) 0:18:07.218 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": "xfs" }, { "action": "destroy 
device", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:35:52 -0400 (0:00:03.474) 0:18:10.693 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:35:53 -0400 (0:00:00.207) 0:18:10.900 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880295.1828272, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"ebd66eaed45707a70cf51337cb298025e4a234d0", "ctime": 1773880295.1798272, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880295.1798272, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:35:54 -0400 (0:00:01.237) 0:18:12.138 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:35:55 -0400 (0:00:01.291) 0:18:13.429 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", 
"ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", 
"RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:33:35 EDT", "StateChangeTimestampMonotonic": "2472809262", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:35:57 -0400 (0:00:02.007) 0:18:15.437 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:35:57 -0400 (0:00:00.364) 0:18:15.801 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": 
false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:35:58 -0400 (0:00:00.306) 0:18:16.108 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:35:58 -0400 (0:00:00.402) 0:18:16.511 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:36:00 -0400 (0:00:01.408) 0:18:17.920 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:36:01 -0400 (0:00:01.722) 0:18:19.642 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:36:03 -0400 (0:00:01.472) 0:18:21.115 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, 
"passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:36:03 -0400 (0:00:00.541) 0:18:21.656 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:36:05 -0400 (0:00:01.709) 0:18:23.365 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880307.910853, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7eb0c7b85e385ff27a39fdeed0099fd6f26adfe9", "ctime": 1773880301.029839, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 213909639, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773880301.0303779, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3757671366", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:36:06 -0400 (0:00:01.139) 0:18:24.505 ******* changed: [managed-node4] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:36:08 -0400 (0:00:01.548) 0:18:26.053 ******* ok: [managed-node4] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:467 Wednesday 18 March 2026 20:36:09 -0400 (0:00:01.724) 0:18:27.777 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:36:10 -0400 (0:00:00.554) 0:18:28.332 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": 
null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:36:10 -0400 (0:00:00.269) 0:18:28.601 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:36:10 -0400 (0:00:00.176) 0:18:28.777 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "6ad385ab-9fb2-4e60-a058-8208bc196156" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:36:12 -0400 (0:00:01.150) 0:18:29.928 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003025", "end": "2026-03-18 20:36:13.048340", "rc": 0, "start": "2026-03-18 20:36:13.045315" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:36:13 -0400 (0:00:01.134) 0:18:31.062 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002992", "end": "2026-03-18 20:36:14.221779", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:36:14.218787" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:36:14 -0400 (0:00:01.226) 0:18:32.288 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': 
'/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:36:14 -0400 (0:00:00.476) 0:18:32.765 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:36:15 -0400 (0:00:00.307) 0:18:33.072 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.031723", "end": "2026-03-18 20:36:16.421132", "rc": 0, "start": "2026-03-18 20:36:16.389409" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:36:18 -0400 (0:00:02.973) 0:18:36.046 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:36:18 -0400 (0:00:00.445) 0:18:36.492 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:36:19 -0400 (0:00:00.855) 0:18:37.347 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:36:19 -0400 (0:00:00.491) 0:18:37.839 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:36:21 -0400 (0:00:01.357) 0:18:39.197 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:36:21 -0400 (0:00:00.276) 0:18:39.474 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] 
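The facts set above (__pvs_lvm_len, _storage_test_expected_pv_count, _storage_test_pool_pvs, _storage_test_pool_pvs_lvm) feed the PV-count assertion that runs next. The assertion body itself is not printed in this log; a plausible minimal sketch using those fact names:

  # Sketch of the kind of PV-count check implied by the facts above; the actual
  # task body lives in test-verify-pool-members.yml and is not shown in this log.
  - name: Verify PV count (illustrative)
    ansible.builtin.assert:
      that:
        - __pvs_lvm_len | int == _storage_test_expected_pv_count | int
        - _storage_test_pool_pvs == _storage_test_pool_pvs_lvm
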
********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:36:21 -0400 (0:00:00.302) 0:18:39.777 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:36:22 -0400 (0:00:00.410) 0:18:40.213 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:36:22 -0400 (0:00:00.377) 0:18:40.591 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:36:23 -0400 (0:00:00.409) 0:18:41.001 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:36:23 -0400 (0:00:00.237) 0:18:41.238 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:36:23 -0400 (0:00:00.425) 0:18:41.663 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:109029): WARNING **: 20:36:24.763: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:36:25 -0400 (0:00:01.291) 0:18:42.955 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:36:25 -0400 (0:00:00.269) 0:18:43.224 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:36:26 -0400 (0:00:00.830) 0:18:44.054 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:36:26 -0400 (0:00:00.311) 0:18:44.366 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:36:26 -0400 (0:00:00.277) 0:18:44.643 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:36:27 -0400 (0:00:00.285) 0:18:44.929 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:36:27 -0400 (0:00:00.247) 0:18:45.177 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:36:27 -0400 (0:00:00.371) 0:18:45.548 ******* skipping: [managed-node4] => { 
"changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:36:27 -0400 (0:00:00.282) 0:18:45.831 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:36:28 -0400 (0:00:00.199) 0:18:46.030 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:36:28 -0400 (0:00:00.183) 0:18:46.214 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:36:28 -0400 (0:00:00.193) 0:18:46.407 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:36:28 -0400 (0:00:00.245) 0:18:46.653 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:36:29 -0400 (0:00:00.286) 0:18:46.939 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:36:29 -0400 (0:00:00.585) 0:18:47.525 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 
'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:36:30 -0400 (0:00:00.674) 0:18:48.199 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:36:30 -0400 (0:00:00.348) 0:18:48.547 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:36:30 -0400 (0:00:00.307) 0:18:48.855 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:36:31 -0400 (0:00:00.356) 0:18:49.212 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:36:31 -0400 (0:00:00.218) 0:18:49.430 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:36:31 -0400 (0:00:00.270) 0:18:49.700 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:36:32 -0400 (0:00:00.296) 0:18:49.997 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:36:32 -0400 (0:00:00.282) 0:18:50.279 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:36:32 -0400 (0:00:00.458) 0:18:50.738 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:36:33 -0400 (0:00:00.520) 0:18:51.258 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:36:33 -0400 (0:00:00.138) 0:18:51.397 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:36:33 -0400 (0:00:00.188) 0:18:51.586 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:36:33 -0400 (0:00:00.187) 0:18:51.774 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:36:34 -0400 (0:00:00.233) 0:18:52.007 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:36:34 -0400 (0:00:00.475) 0:18:52.483 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:36:34 -0400 (0:00:00.247) 0:18:52.730 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:36:35 -0400 (0:00:00.260) 0:18:52.991 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:36:35 -0400 (0:00:00.506) 0:18:53.498 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:36:36 -0400 (0:00:00.461) 0:18:53.959 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:36:36 -0400 (0:00:00.427) 0:18:54.387 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | 
int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:36:36 -0400 (0:00:00.338) 0:18:54.725 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:36:37 -0400 (0:00:00.209) 0:18:54.935 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:36:37 -0400 (0:00:00.290) 0:18:55.226 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:36:37 -0400 (0:00:00.339) 0:18:55.566 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:36:38 -0400 (0:00:00.396) 0:18:55.962 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:36:38 -0400 (0:00:00.538) 0:18:56.501 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 
'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:36:39 -0400 (0:00:00.673) 0:18:57.174 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:36:39 -0400 (0:00:00.294) 0:18:57.469 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:36:39 -0400 (0:00:00.176) 0:18:57.646 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:36:39 -0400 (0:00:00.195) 0:18:57.842 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:36:40 -0400 (0:00:00.170) 0:18:58.013 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:36:40 -0400 (0:00:00.268) 0:18:58.281 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:36:40 -0400 (0:00:00.251) 0:18:58.533 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] 
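The member checks in this block (encryption, thin pool, VDO) all compare facts gathered from the live system against the pool spec, and most are skipped because their conditions are false for this unencrypted, non-thin, non-VDO pool. For the crypttab check that did run earlier in the block, a minimal sketch of the kind of assertion involved, using the fact names echoed above:

  # Sketch only: fact names are taken from the log above; the real check is in
  # verify-pool-member-crypttab.yml and may be implemented differently.
  - name: Check for /etc/crypttab entries for a pool member (illustrative)
    ansible.builtin.assert:
      that:
        - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
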
*********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:36:40 -0400 (0:00:00.280) 0:18:58.813 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:36:41 -0400 (0:00:00.613) 0:18:59.427 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:36:41 -0400 (0:00:00.141) 0:18:59.569 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:36:41 -0400 (0:00:00.266) 0:18:59.836 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:36:42 -0400 (0:00:00.243) 0:19:00.080 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:36:42 -0400 (0:00:00.224) 0:19:00.304 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:36:42 -0400 (0:00:00.264) 0:19:00.569 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:36:42 -0400 (0:00:00.264) 0:19:00.833 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:36:43 -0400 (0:00:00.306) 0:19:01.140 ******* ok: 
[managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:36:43 -0400 (0:00:00.257) 0:19:01.397 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:36:43 -0400 (0:00:00.413) 0:19:01.811 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:36:44 -0400 (0:00:00.374) 0:19:02.186 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:36:46 -0400 (0:00:02.110) 0:19:04.296 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:36:46 -0400 (0:00:00.436) 0:19:04.733 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:36:47 -0400 (0:00:00.380) 0:19:05.113 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:36:47 -0400 (0:00:00.504) 0:19:05.617 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:36:48 -0400 (0:00:00.414) 0:19:06.032 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:36:48 -0400 (0:00:00.394) 0:19:06.426 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:36:48 -0400 (0:00:00.417) 0:19:06.844 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] 
****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:36:49 -0400 (0:00:00.489) 0:19:07.333 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:36:49 -0400 (0:00:00.219) 0:19:07.553 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:36:49 -0400 (0:00:00.233) 0:19:07.786 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:36:50 -0400 (0:00:00.165) 0:19:07.952 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:36:50 -0400 (0:00:00.303) 0:19:08.256 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:36:51 -0400 (0:00:00.754) 0:19:09.011 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:36:51 -0400 (0:00:00.430) 0:19:09.442 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:36:51 -0400 (0:00:00.329) 0:19:09.772 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:36:52 -0400 (0:00:00.236) 0:19:10.008 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:36:52 -0400 (0:00:00.325) 0:19:10.333 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:36:52 -0400 (0:00:00.224) 0:19:10.558 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:36:53 -0400 (0:00:00.374) 0:19:10.933 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:36:53 -0400 (0:00:00.435) 0:19:11.369 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880552.5263445, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880552.5263445, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880552.5263445, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:36:54 -0400 (0:00:01.359) 0:19:12.728 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:36:55 -0400 (0:00:00.351) 0:19:13.080 ******* skipping: [managed-node4] => { "changed": 
false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:36:55 -0400 (0:00:00.180) 0:19:13.260 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:36:55 -0400 (0:00:00.418) 0:19:13.679 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:36:56 -0400 (0:00:00.265) 0:19:13.945 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:36:56 -0400 (0:00:00.238) 0:19:14.183 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:36:56 -0400 (0:00:00.309) 0:19:14.493 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:36:56 -0400 (0:00:00.207) 0:19:14.700 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:36:58 -0400 (0:00:02.034) 0:19:16.735 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:36:59 -0400 (0:00:00.214) 0:19:16.950 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:36:59 -0400 (0:00:00.280) 0:19:17.231 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:36:59 -0400 (0:00:00.369) 0:19:17.600 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:36:59 -0400 (0:00:00.231) 0:19:17.832 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:37:00 -0400 (0:00:00.207) 0:19:18.040 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:37:00 -0400 (0:00:00.212) 0:19:18.252 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:37:00 -0400 (0:00:00.158) 0:19:18.410 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:37:00 -0400 (0:00:00.277) 0:19:18.688 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:37:01 -0400 (0:00:00.387) 0:19:19.075 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 
March 2026 20:37:01 -0400 (0:00:00.323) 0:19:19.399 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:37:01 -0400 (0:00:00.293) 0:19:19.692 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:37:02 -0400 (0:00:00.337) 0:19:20.030 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:37:02 -0400 (0:00:00.317) 0:19:20.347 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:37:02 -0400 (0:00:00.250) 0:19:20.598 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:37:02 -0400 (0:00:00.259) 0:19:20.858 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:37:03 -0400 (0:00:00.242) 0:19:21.100 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:37:03 -0400 (0:00:00.249) 0:19:21.350 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 
18 March 2026 20:37:03 -0400 (0:00:00.229) 0:19:21.580 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:37:03 -0400 (0:00:00.266) 0:19:21.850 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:37:04 -0400 (0:00:00.272) 0:19:22.123 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:37:04 -0400 (0:00:00.178) 0:19:22.301 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:37:04 -0400 (0:00:00.224) 0:19:22.525 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:37:04 -0400 (0:00:00.233) 0:19:22.758 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:37:05 -0400 (0:00:00.329) 0:19:23.088 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:37:06 -0400 (0:00:01.461) 0:19:24.549 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:37:08 -0400 (0:00:01.448) 0:19:25.998 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] 
****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:37:08 -0400 (0:00:00.355) 0:19:26.354 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:37:08 -0400 (0:00:00.294) 0:19:26.648 ******* ok: [managed-node4] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:37:10 -0400 (0:00:01.322) 0:19:27.971 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:37:10 -0400 (0:00:00.371) 0:19:28.342 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:37:10 -0400 (0:00:00.323) 0:19:28.665 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:37:11 -0400 (0:00:00.378) 0:19:29.043 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:37:11 -0400 (0:00:00.318) 0:19:29.362 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:37:11 -0400 (0:00:00.375) 0:19:29.737 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:37:12 -0400 (0:00:00.260) 0:19:29.998 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK 
[Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:37:12 -0400 (0:00:00.409) 0:19:30.408 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:37:12 -0400 (0:00:00.378) 0:19:30.786 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:37:13 -0400 (0:00:00.365) 0:19:31.152 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:37:13 -0400 (0:00:00.316) 0:19:31.468 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:37:13 -0400 (0:00:00.398) 0:19:31.867 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:37:14 -0400 (0:00:00.299) 0:19:32.167 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:37:14 -0400 (0:00:00.292) 0:19:32.459 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:37:14 -0400 (0:00:00.348) 0:19:32.807 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:37:15 -0400 (0:00:00.284) 0:19:33.091 ******* skipping: [managed-node4] => { "changed": 
false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:37:15 -0400 (0:00:00.355) 0:19:33.446 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:37:15 -0400 (0:00:00.427) 0:19:33.874 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:37:16 -0400 (0:00:00.309) 0:19:34.184 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:37:16 -0400 (0:00:00.464) 0:19:34.649 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:37:17 -0400 (0:00:00.363) 0:19:35.012 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:37:17 -0400 (0:00:00.270) 0:19:35.282 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:37:17 -0400 (0:00:00.474) 0:19:35.757 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030532", "end": "2026-03-18 20:37:18.941188", "rc": 0, "start": "2026-03-18 20:37:18.910656" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:37:19 -0400 (0:00:01.252) 0:19:37.010 ******* ok: [managed-node4] => { 
"ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:37:19 -0400 (0:00:00.369) 0:19:37.379 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:37:19 -0400 (0:00:00.355) 0:19:37.735 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:37:20 -0400 (0:00:00.234) 0:19:37.969 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:37:20 -0400 (0:00:00.211) 0:19:38.180 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:37:20 -0400 (0:00:00.239) 0:19:38.419 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:37:20 -0400 (0:00:00.189) 0:19:38.609 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:37:20 -0400 (0:00:00.228) 0:19:38.837 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:37:21 -0400 (0:00:00.253) 0:19:39.091 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 18 March 2026 20:37:21 
-0400 (0:00:00.208) 0:19:39.300 ******* changed: [managed-node4] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 6] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:473 Wednesday 18 March 2026 20:37:22 -0400 (0:00:01.372) 0:19:40.673 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node4 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 18 March 2026 20:37:23 -0400 (0:00:00.656) 0:19:41.329 ******* ok: [managed-node4] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 18 March 2026 20:37:23 -0400 (0:00:00.345) 0:19:41.675 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:37:24 -0400 (0:00:00.272) 0:19:41.951 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:37:24 -0400 (0:00:00.022) 0:19:41.974 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:37:24 -0400 (0:00:00.196) 0:19:42.170 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:37:24 -0400 (0:00:00.419) 0:19:42.590 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:37:25 -0400 (0:00:00.356) 0:19:42.946 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:37:27 -0400 (0:00:02.089) 0:19:45.035 ******* 
skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:37:27 -0400 (0:00:00.743) 0:19:45.779 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:37:28 -0400 (0:00:00.350) 0:19:46.130 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:37:28 -0400 (0:00:00.280) 0:19:46.410 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:37:28 -0400 (0:00:00.213) 0:19:46.624 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:37:28 -0400 (0:00:00.230) 0:19:46.854 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount 
to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:37:29 -0400 (0:00:00.746) 0:19:47.601 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:37:31 -0400 (0:00:02.027) 0:19:49.629 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:37:32 -0400 (0:00:00.353) 0:19:49.982 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:37:32 -0400 (0:00:00.324) 0:19:50.307 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:37:34 -0400 (0:00:02.402) 0:19:52.724 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:37:35 -0400 (0:00:00.555) 0:19:53.299 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:37:35 -0400 (0:00:00.162) 0:19:53.462 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:37:35 -0400 (0:00:00.379) 0:19:53.842 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:37:36 -0400 (0:00:00.154) 0:19:53.997 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:37:38 -0400 (0:00:01.974) 0:19:55.972 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", 
"state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", 
"status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service": { "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { 
"name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:37:41 -0400 (0:00:02.940) 0:19:58.913 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"dev-mapper-foo\\\\x2dtest1.device\" \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket cryptsetup-pre.target", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\" umount.target cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog 
cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": 
"infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Wed 2026-03-18 20:33:35 EDT", "StateChangeTimestampMonotonic": "2472809262", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": 
"disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:37:42 -0400 (0:00:01.847) 0:20:00.760 ******* fatal: [managed-node4]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111 Wednesday 18 March 2026 20:37:45 -0400 (0:00:02.526) 0:20:03.287 ******* fatal: [managed-node4]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 
'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:37:45 -0400 (0:00:00.346) 0:20:03.633 ******* changed: [managed-node4] => (item=systemd-cryptsetup@luks\x2de3dc7518\x2d65da\x2d4f95\x2db3a9\x2dfe385f43a1b3.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "name": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "FreezerState": 
"running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13684", "LimitNPROCSoft": "13684", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13684", "LimitSIGPENDINGSoft": "13684", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2de3dc7518\\x2d65da\\x2d4f95\\x2db3a9\\x2dfe385f43a1b3.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2de3dc7518\\\\x2d65da\\\\x2d4f95\\\\x2db3a9\\\\x2dfe385f43a1b3.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", 
"RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21894", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Wednesday 18 March 2026 20:37:49 -0400 (0:00:03.311) 0:20:06.945 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Wednesday 18 March 2026 20:37:52 -0400 (0:00:03.471) 0:20:10.417 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Wednesday 18 March 2026 20:37:52 -0400 (0:00:00.342) 0:20:10.759 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 18 March 2026 20:37:53 -0400 (0:00:00.316) 0:20:11.076 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880642.6275249, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773880642.6275249, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1773880642.6275249, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", 
"readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2615408141", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 18 March 2026 20:37:54 -0400 (0:00:01.205) 0:20:12.282 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:497 Wednesday 18 March 2026 20:37:54 -0400 (0:00:00.393) 0:20:12.675 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:37:55 -0400 (0:00:00.612) 0:20:13.288 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:37:55 -0400 (0:00:00.022) 0:20:13.310 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:37:55 -0400 (0:00:00.275) 0:20:13.587 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:37:56 -0400 (0:00:00.388) 0:20:13.975 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:37:56 -0400 (0:00:00.279) 0:20:14.254 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:37:58 -0400 (0:00:02.235) 0:20:16.489 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", 
"libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:37:59 -0400 (0:00:00.653) 0:20:17.142 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:37:59 -0400 (0:00:00.252) 0:20:17.395 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:37:59 -0400 (0:00:00.274) 0:20:17.669 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:38:00 -0400 (0:00:00.225) 0:20:17.894 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:38:00 -0400 (0:00:00.205) 0:20:18.100 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:38:00 -0400 (0:00:00.607) 0:20:18.707 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:38:02 -0400 (0:00:01.829) 0:20:20.537 ******* ok: [managed-node4] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_luks_version": "luks2", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:38:02 -0400 (0:00:00.232) 0:20:20.769 ******* ok: [managed-node4] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:38:03 -0400 (0:00:00.205) 0:20:20.974 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:38:05 -0400 (0:00:02.403) 0:20:23.378 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:38:05 -0400 (0:00:00.304) 0:20:23.683 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:38:05 -0400 (0:00:00.143) 0:20:23.826 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 18 March 2026 20:38:06 -0400 (0:00:00.201) 0:20:24.028 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:38:06 -0400 (0:00:00.164) 0:20:24.192 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:38:08 -0400 (0:00:01.707) 0:20:25.900 ******* ok: 
[managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": 
"modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { 
"name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": 
"systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:38:10 -0400 (0:00:02.913) 0:20:28.813 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:38:11 -0400 (0:00:00.415) 0:20:29.229 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": 
"xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:38:23 -0400 (0:00:12.091) 0:20:41.321 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:38:23 -0400 (0:00:00.243) 0:20:41.565 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880563.0403655, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "2231eacef203373f51099c3294bb0b2cf54fbad7", "ctime": 1773880563.0373654, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880563.0373654, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1458, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:38:24 -0400 (0:00:01.069) 0:20:42.634 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:38:25 -0400 (0:00:01.081) 0:20:43.715 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:38:26 -0400 (0:00:00.513) 0:20:44.229 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", 
"fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:38:26 -0400 (0:00:00.314) 0:20:44.543 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", 
"_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:38:26 -0400 (0:00:00.212) 0:20:44.755 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:38:27 -0400 (0:00:00.255) 0:20:45.011 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:38:28 -0400 (0:00:01.152) 0:20:46.164 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:38:29 -0400 (0:00:01.350) 0:20:47.514 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:38:31 -0400 (0:00:01.463) 0:20:48.978 ******* skipping: [managed-node4] => (item={'src': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:38:31 -0400 (0:00:00.261) 0:20:49.240 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:38:32 -0400 (0:00:01.149) 0:20:50.389 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880574.220388, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1773880567.9663754, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 88080581, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1773880567.9677742, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2820693813", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:38:33 -0400 (0:00:01.062) 0:20:51.452 ******* changed: [managed-node4] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-5c906400-6b56-43ed-b172-7741e655a365', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "present" } } MSG: line added TASK 
[fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:38:34 -0400 (0:00:01.168) 0:20:52.620 ******* ok: [managed-node4] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:513 Wednesday 18 March 2026 20:38:36 -0400 (0:00:01.942) 0:20:54.562 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:38:37 -0400 (0:00:00.563) 0:20:55.126 ******* ok: [managed-node4] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:38:37 -0400 (0:00:00.363) 0:20:55.490 ******* skipping: [managed-node4] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:38:37 -0400 (0:00:00.251) 0:20:55.742 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "5c906400-6b56-43ed-b172-7741e655a365" }, "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "size": "4G", "type": "crypt", "uuid": "75820a36-0b49-49e0-b2a5-a58a60a80462" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:38:38 -0400 (0:00:01.033) 0:20:56.775 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:01.004011", "end": "2026-03-18 20:38:40.751991", "rc": 0, "start": "2026-03-18 20:38:39.747980" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:38:40 -0400 (0:00:02.011) 0:20:58.786 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:01.004201", "end": "2026-03-18 20:38:42.706683", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:38:41.702482" } STDOUT: luks-5c906400-6b56-43ed-b172-7741e655a365 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:38:42 -0400 (0:00:02.004) 0:21:00.791 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node4 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 
'_device': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 18 March 2026 20:38:43 -0400 (0:00:00.453) 0:21:01.245 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 18 March 2026 20:38:43 -0400 (0:00:00.198) 0:21:01.443 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.030906", "end": "2026-03-18 20:38:44.450389", "rc": 0, "start": "2026-03-18 20:38:44.419483" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 18 March 2026 20:38:44 -0400 (0:00:01.010) 0:21:02.453 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 18 March 2026 20:38:44 -0400 (0:00:00.216) 0:21:02.670 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node4 => (item=members) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node4 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 18 March 2026 20:38:45 -0400 (0:00:00.450) 0:21:03.122 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 18 March 2026 20:38:45 -0400 (0:00:00.327) 0:21:03.450 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 18 March 2026 20:38:46 -0400 (0:00:01.087) 0:21:04.537 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 18 March 2026 20:38:46 -0400 (0:00:00.302) 0:21:04.839 ******* ok: [managed-node4] 
=> { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 18 March 2026 20:38:47 -0400 (0:00:00.303) 0:21:05.142 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 18 March 2026 20:38:47 -0400 (0:00:00.347) 0:21:05.490 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 18 March 2026 20:38:47 -0400 (0:00:00.209) 0:21:05.706 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 18 March 2026 20:38:48 -0400 (0:00:00.284) 0:21:05.990 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Wednesday 18 March 2026 20:38:48 -0400 (0:00:00.297) 0:21:06.287 ******* ok: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Wednesday 18 March 2026 20:38:48 -0400 (0:00:00.346) 0:21:06.635 ******* ok: [managed-node4] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:115589): WARNING **: 20:38:49.542: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.13.221 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.13.221 originally 10.31.13.221 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/e0f745688c' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.13.221 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Wednesday 18 March 2026 20:38:49 -0400 (0:00:01.030) 0:21:07.665 ******* skipping: [managed-node4] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Wednesday 18 March 2026 20:38:50 -0400 (0:00:00.262) 0:21:07.928 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node4 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 18 March 2026 20:38:50 -0400 (0:00:00.502) 0:21:08.431 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 18 March 2026 20:38:50 -0400 (0:00:00.211) 0:21:08.642 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 18 March 2026 20:38:50 -0400 (0:00:00.210) 0:21:08.852 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 18 March 2026 20:38:51 -0400 (0:00:00.194) 0:21:09.047 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 18 March 2026 20:38:51 -0400 (0:00:00.226) 0:21:09.273 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 18 March 2026 20:38:51 -0400 (0:00:00.245) 0:21:09.519 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 18 March 2026 20:38:51 -0400 (0:00:00.247) 0:21:09.786 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 18 March 2026 20:38:52 -0400 (0:00:00.214) 0:21:10.001 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 18 March 2026 20:38:52 -0400 (0:00:00.179) 0:21:10.181 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 18 March 2026 20:38:52 -0400 (0:00:00.284) 0:21:10.466 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 18 March 2026 20:38:52 -0400 (0:00:00.249) 0:21:10.715 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Wednesday 18 March 2026 20:38:53 -0400 (0:00:00.247) 0:21:10.962 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node4 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 18 March 2026 20:38:53 -0400 (0:00:00.772) 0:21:11.735 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 18 March 2026 20:38:54 -0400 (0:00:00.515) 0:21:12.250 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 18 March 2026 20:38:54 -0400 (0:00:00.341) 0:21:12.592 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 18 March 2026 20:38:55 -0400 (0:00:00.362) 0:21:12.955 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 18 March 2026 20:38:55 -0400 (0:00:00.415) 0:21:13.370 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 18 March 2026 20:38:55 -0400 (0:00:00.335) 0:21:13.706 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 18 March 2026 20:38:56 -0400 (0:00:00.300) 
0:21:14.006 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 18 March 2026 20:38:56 -0400 (0:00:00.382) 0:21:14.389 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Wednesday 18 March 2026 20:38:56 -0400 (0:00:00.435) 0:21:14.825 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node4 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 18 March 2026 20:38:57 -0400 (0:00:00.641) 0:21:15.466 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 18 March 2026 20:38:58 -0400 (0:00:00.633) 0:21:16.100 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 18 March 2026 20:38:58 -0400 (0:00:00.247) 0:21:16.347 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that 
volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 18 March 2026 20:38:58 -0400 (0:00:00.259) 0:21:16.607 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 18 March 2026 20:38:58 -0400 (0:00:00.205) 0:21:16.812 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Wednesday 18 March 2026 20:38:59 -0400 (0:00:00.345) 0:21:17.158 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node4 TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 18 March 2026 20:39:00 -0400 (0:00:00.886) 0:21:18.044 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 18 March 2026 20:39:00 -0400 (0:00:00.270) 0:21:18.314 ******* skipping: [managed-node4] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 18 March 2026 20:39:00 -0400 (0:00:00.417) 0:21:18.731 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node4 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 18 March 2026 20:39:01 -0400 (0:00:00.419) 0:21:19.151 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 18 March 2026 20:39:01 -0400 (0:00:00.501) 0:21:19.653 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 18 March 2026 20:39:02 -0400 (0:00:00.375) 0:21:20.028 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 18 March 2026 20:39:02 -0400 (0:00:00.308) 0:21:20.336 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 18 March 2026 20:39:02 -0400 (0:00:00.217) 0:21:20.554 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 18 March 2026 20:39:02 -0400 (0:00:00.210) 0:21:20.765 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 18 March 2026 20:39:03 -0400 (0:00:00.287) 0:21:21.052 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Wednesday 18 March 2026 20:39:03 -0400 (0:00:00.255) 0:21:21.308 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node4 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 18 March 2026 20:39:04 -0400 (0:00:00.680) 0:21:21.988 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 
'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 18 March 2026 20:39:04 -0400 (0:00:00.471) 0:21:22.460 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 18 March 2026 20:39:04 -0400 (0:00:00.227) 0:21:22.687 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 18 March 2026 20:39:04 -0400 (0:00:00.167) 0:21:22.854 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 18 March 2026 20:39:05 -0400 (0:00:00.199) 0:21:23.054 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 18 March 2026 20:39:05 -0400 (0:00:00.265) 0:21:23.319 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 18 March 2026 20:39:05 -0400 (0:00:00.252) 0:21:23.572 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } 
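Every VDO task in this block skips for the same reason: the test volume defines neither deduplication nor compression, so the shared guard "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none" evaluates to false. A minimal sketch of such a guarded check follows; the lvs query is only a placeholder (the role's real command is not visible in this log), and the register name storage_test_vdo_status is taken from the reset task that follows.

    - name: Get information about VDO deduplication (illustrative placeholder)
      ansible.builtin.command:
        cmd: lvs --noheadings -o lv_name {{ storage_test_pool.name }}  # placeholder query, not the role's actual command
      register: storage_test_vdo_status
      changed_when: false
      when: storage_test_vdo_volume.deduplication != none or
            storage_test_vdo_volume.compression != none

With both attributes set to None on this volume, the when: clause is false, Ansible records the false_condition shown above, and the task is skipped without touching the managed node.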
TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 18 March 2026 20:39:05 -0400 (0:00:00.237) 0:21:23.810 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Wednesday 18 March 2026 20:39:06 -0400 (0:00:00.365) 0:21:24.175 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node4 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 18 March 2026 20:39:07 -0400 (0:00:00.887) 0:21:25.063 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 18 March 2026 20:39:07 -0400 (0:00:00.309) 0:21:25.372 ******* skipping: [managed-node4] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Wednesday 18 March 2026 20:39:07 -0400 (0:00:00.300) 0:21:25.672 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Wednesday 18 March 2026 20:39:08 -0400 (0:00:00.236) 0:21:25.909 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Wednesday 18 March 2026 20:39:08 -0400 (0:00:00.271) 0:21:26.181 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Wednesday 18 March 2026 20:39:08 -0400 (0:00:00.221) 0:21:26.402 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Wednesday 18 March 2026 20:39:08 -0400 (0:00:00.230) 0:21:26.633 
******* ok: [managed-node4] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Wednesday 18 March 2026 20:39:09 -0400 (0:00:00.269) 0:21:26.902 ******* ok: [managed-node4] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 18 March 2026 20:39:09 -0400 (0:00:00.293) 0:21:27.196 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:39:09 -0400 (0:00:00.626) 0:21:27.822 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:39:10 -0400 (0:00:00.356) 0:21:28.179 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:39:12 -0400 (0:00:02.113) 0:21:30.292 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:39:12 -0400 (0:00:00.546) 0:21:30.839 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:39:13 -0400 (0:00:00.408) 0:21:31.247 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:39:13 -0400 (0:00:00.506) 0:21:31.754 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:39:14 -0400 (0:00:00.359) 0:21:32.113 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:39:14 -0400 (0:00:00.420) 0:21:32.534 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK 
[Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:39:15 -0400 (0:00:00.488) 0:21:33.023 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:39:15 -0400 (0:00:00.478) 0:21:33.501 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:39:15 -0400 (0:00:00.268) 0:21:33.769 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:39:16 -0400 (0:00:00.271) 0:21:34.041 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:39:16 -0400 (0:00:00.191) 0:21:34.232 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:39:16 -0400 (0:00:00.322) 0:21:34.554 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:39:17 -0400 (0:00:00.770) 0:21:35.325 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:39:17 -0400 (0:00:00.414) 0:21:35.739 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:39:18 -0400 (0:00:00.407) 0:21:36.147 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:39:18 -0400 (0:00:00.289) 0:21:36.437 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:39:18 -0400 (0:00:00.398) 0:21:36.836 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:39:19 -0400 (0:00:00.336) 0:21:37.172 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:39:19 -0400 (0:00:00.448) 0:21:37.621 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:39:20 -0400 (0:00:00.351) 0:21:37.973 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880702.7026453, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880702.7026453, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2434, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880702.7026453, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:39:21 -0400 (0:00:01.308) 0:21:39.281 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:39:21 -0400 (0:00:00.395) 0:21:39.677 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:39:22 -0400 (0:00:00.279) 0:21:39.956 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:39:22 -0400 (0:00:00.444) 0:21:40.401 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:39:22 -0400 (0:00:00.318) 0:21:40.720 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:39:23 -0400 (0:00:00.219) 0:21:40.939 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:39:23 -0400 (0:00:00.394) 0:21:41.333 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880703.1766462, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880703.1766462, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2510, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1773880703.1766462, "nlink": 1, "path": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:39:24 -0400 (0:00:01.243) 0:21:42.577 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:39:26 -0400 (0:00:01.957) 0:21:44.535 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007385", "end": "2026-03-18 20:39:27.897333", "rc": 0, "start": "2026-03-18 20:39:27.889948" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 5c906400-6b56-43ed-b172-7741e655a365 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 683505 Threads: 2 Salt: e1 4e 36 c8 d5 ff 89 68 8b 22 c6 fe 47 1a 1f 6e a3 60 50 d0 b6 56 35 ae d2 71 ae 2d 60 78 1b 5a AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131863 Salt: 67 a3 2a 48 ee 8e d5 99 9a ea e2 a7 d8 d0 e8 21 26 bf ca e0 51 59 b2 74 a9 3d 52 ac d7 b6 c5 19 Digest: 74 a5 88 18 ee 90 4c 60 43 37 68 1f 50 08 12 77 80 db 01 fb 37 ca c1 fc ff 66 c9 6e fb 2d cd 76 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:39:28 -0400 (0:00:01.458) 0:21:45.993 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:39:28 -0400 (0:00:00.297) 0:21:46.291 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:39:28 -0400 (0:00:00.204) 0:21:46.495 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:39:28 -0400 (0:00:00.267) 0:21:46.763 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:39:29 -0400 (0:00:00.240) 0:21:47.003 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] 
***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:39:29 -0400 (0:00:00.522) 0:21:47.526 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:39:30 -0400 (0:00:00.380) 0:21:47.906 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:39:32 -0400 (0:00:02.032) 0:21:49.939 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-5c906400-6b56-43ed-b172-7741e655a365 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:39:32 -0400 (0:00:00.529) 0:21:50.469 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:39:32 -0400 (0:00:00.387) 0:21:50.856 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:39:33 -0400 (0:00:00.452) 0:21:51.309 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:39:33 -0400 (0:00:00.496) 0:21:51.805 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:39:34 -0400 (0:00:00.367) 0:21:52.173 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 
March 2026 20:39:34 -0400 (0:00:00.299) 0:21:52.473 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:39:34 -0400 (0:00:00.235) 0:21:52.708 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:39:35 -0400 (0:00:00.209) 0:21:52.917 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:39:35 -0400 (0:00:00.253) 0:21:53.171 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:39:35 -0400 (0:00:00.348) 0:21:53.519 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:39:35 -0400 (0:00:00.199) 0:21:53.719 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:39:36 -0400 (0:00:00.450) 0:21:54.170 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:39:36 -0400 (0:00:00.224) 0:21:54.395 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:39:36 -0400 (0:00:00.250) 0:21:54.673 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:39:37 -0400 (0:00:00.225) 0:21:54.899 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:39:37 -0400 (0:00:00.259) 0:21:55.159 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:39:38 -0400 (0:00:01.274) 0:21:56.433 ******* ok: [managed-node4] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:39:39 -0400 (0:00:01.130) 0:21:57.564 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:39:40 -0400 (0:00:00.391) 0:21:57.955 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:39:40 -0400 (0:00:00.348) 0:21:58.304 ******* ok: [managed-node4] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:39:41 -0400 (0:00:01.492) 0:21:59.796 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:39:42 -0400 (0:00:00.362) 0:22:00.159 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:39:42 -0400 (0:00:00.244) 0:22:00.403 ******* skipping: [managed-node4] => { "false_condition": "'%' in storage_test_volume.size | string" } 
TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:39:42 -0400 (0:00:00.435) 0:22:00.838 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:39:43 -0400 (0:00:00.338) 0:22:01.176 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:39:43 -0400 (0:00:00.432) 0:22:01.608 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:39:44 -0400 (0:00:00.392) 0:22:02.000 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:39:44 -0400 (0:00:00.460) 0:22:02.461 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:39:45 -0400 (0:00:00.427) 0:22:02.888 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:39:45 -0400 (0:00:00.364) 0:22:03.253 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:39:45 -0400 (0:00:00.372) 0:22:03.626 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:39:46 -0400 (0:00:00.348) 0:22:03.974 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:39:46 -0400 (0:00:00.387) 0:22:04.362 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:39:46 -0400 (0:00:00.283) 0:22:04.658 ******* skipping: [managed-node4] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:39:47 -0400 (0:00:00.312) 0:22:04.970 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:39:47 -0400 (0:00:00.618) 0:22:05.589 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:39:48 -0400 (0:00:00.321) 0:22:05.910 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:39:48 -0400 (0:00:00.326) 0:22:06.236 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:39:48 -0400 (0:00:00.292) 0:22:06.529 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:39:49 -0400 (0:00:00.411) 0:22:06.941 ******* ok: [managed-node4] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", 
"parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:39:49 -0400 (0:00:00.339) 0:22:07.280 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:39:49 -0400 (0:00:00.298) 0:22:07.579 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:39:50 -0400 (0:00:00.442) 0:22:08.022 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032446", "end": "2026-03-18 20:39:51.186121", "rc": 0, "start": "2026-03-18 20:39:51.153675" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:39:51 -0400 (0:00:01.231) 0:22:09.253 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:39:51 -0400 (0:00:00.367) 0:22:09.621 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:39:52 -0400 (0:00:00.398) 0:22:10.019 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:39:52 -0400 (0:00:00.262) 0:22:10.281 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:39:52 -0400 (0:00:00.326) 0:22:10.608 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** 
task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:39:53 -0400 (0:00:00.303) 0:22:10.911 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:39:53 -0400 (0:00:00.260) 0:22:11.172 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:39:53 -0400 (0:00:00.384) 0:22:11.556 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:39:53 -0400 (0:00:00.218) 0:22:11.775 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:516 Wednesday 18 March 2026 20:39:54 -0400 (0:00:00.318) 0:22:12.093 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node4 TASK [Clear facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:10 Wednesday 18 March 2026 20:39:54 -0400 (0:00:00.702) 0:22:12.796 ******* META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:24 Wednesday 18 March 2026 20:39:54 -0400 (0:00:00.042) 0:22:12.838 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:34 Wednesday 18 March 2026 20:39:55 -0400 (0:00:00.271) 0:22:13.109 ******* included: fedora.linux_system_roles.storage for managed-node4 TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 18 March 2026 20:39:55 -0400 (0:00:00.389) 0:22:13.499 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 18 March 2026 20:39:55 -0400 (0:00:00.240) 0:22:13.739 ******* ok: [managed-node4] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 18 March 2026 20:39:58 -0400 (0:00:02.413) 0:22:16.153 ******* skipping: [managed-node4] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node4] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node4] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 18 March 2026 20:39:59 -0400 (0:00:00.735) 0:22:16.893 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 18 March 2026 20:39:59 -0400 (0:00:00.307) 0:22:17.201 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 18 March 2026 20:39:59 -0400 (0:00:00.325) 0:22:17.527 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 18 March 2026 20:39:59 -0400 (0:00:00.300) 0:22:17.827 ******* ok: 
[managed-node4] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 18 March 2026 20:40:00 -0400 (0:00:00.244) 0:22:18.071 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 18 March 2026 20:40:00 -0400 (0:00:00.585) 0:22:18.656 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 18 March 2026 20:40:02 -0400 (0:00:01.905) 0:22:20.561 ******* ok: [managed-node4] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 18 March 2026 20:40:02 -0400 (0:00:00.301) 0:22:20.863 ******* ok: [managed-node4] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 18 March 2026 20:40:03 -0400 (0:00:00.233) 0:22:21.096 ******* ok: [managed-node4] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Wednesday 18 March 2026 20:40:05 -0400 (0:00:02.061) 0:22:23.158 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node4 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 18 March 2026 20:40:05 -0400 (0:00:00.518) 0:22:23.676 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 18 March 2026 20:40:06 -0400 (0:00:00.223) 0:22:23.900 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 
Wednesday 18 March 2026 20:40:06 -0400 (0:00:00.217) 0:22:24.117 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 Wednesday 18 March 2026 20:40:06 -0400 (0:00:00.155) 0:22:24.273 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Wednesday 18 March 2026 20:40:08 -0400 (0:00:01.853) 0:22:26.126 ******* ok: [managed-node4] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, 
"getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": 
"raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", 
"state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { 
"name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:64 Wednesday 18 March 2026 20:40:11 -0400 (0:00:02.862) 0:22:28.989 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Wednesday 18 March 2026 20:40:11 -0400 (0:00:00.506) 0:22:29.495 ******* changed: [managed-node4] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:85 Wednesday 18 March 2026 20:40:15 -0400 (0:00:03.398) 0:22:32.894 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:92 Wednesday 18 March 2026 20:40:15 -0400 (0:00:00.298) 0:22:33.193 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880710.9546618, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "46ab08bc7a24b45db8b19a5e308654cfb7341b2a", "ctime": 1773880710.9516618, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 171966659, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1773880710.9516618, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "2477030431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:97 Wednesday 18 March 2026 20:40:16 -0400 (0:00:01.376) 0:22:34.569 ******* ok: [managed-node4] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Wednesday 18 March 2026 20:40:17 -0400 (0:00:01.294) 0:22:35.864 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:121 Wednesday 18 March 2026 20:40:18 -0400 (0:00:00.442) 0:22:36.306 ******* ok: [managed-node4] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": 
null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:130 Wednesday 18 March 2026 20:40:18 -0400 (0:00:00.372) 0:22:36.679 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:134 Wednesday 18 March 2026 20:40:19 -0400 (0:00:00.264) 0:22:36.943 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:150 Wednesday 18 March 2026 20:40:19 -0400 (0:00:00.350) 0:22:37.294 ******* redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node4] => (item={'src': '/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-5c906400-6b56-43ed-b172-7741e655a365" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 Wednesday 18 March 2026 20:40:20 -0400 (0:00:01.373) 0:22:38.668 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:166 Wednesday 18 March 2026 20:40:22 -0400 (0:00:01.617) 0:22:40.286 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:177 Wednesday 18 March 2026 20:40:22 -0400 (0:00:00.294) 0:22:40.580 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:189 Wednesday 18 March 2026 20:40:22 -0400 (0:00:00.253) 0:22:40.834 ******* ok: [managed-node4] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:197 Wednesday 18 March 2026 20:40:24 -0400 (0:00:01.674) 0:22:42.509 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880721.7046833, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "91ca7c56110fd767c626b58340450cffa0c903d6", "ctime": 1773880714.5876691, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 398459084, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1773880714.5878491, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1615653956", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account 
for changes we just made] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:202 Wednesday 18 March 2026 20:40:25 -0400 (0:00:01.171) 0:22:43.681 ******* changed: [managed-node4] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-5c906400-6b56-43ed-b172-7741e655a365', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-5c906400-6b56-43ed-b172-7741e655a365", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:224 Wednesday 18 March 2026 20:40:27 -0400 (0:00:01.339) 0:22:45.021 ******* ok: [managed-node4] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:525 Wednesday 18 March 2026 20:40:28 -0400 (0:00:01.856) 0:22:46.877 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node4 TASK [Print out pool information] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 18 March 2026 20:40:29 -0400 (0:00:00.583) 0:22:47.461 ******* skipping: [managed-node4] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 18 March 2026 20:40:29 -0400 (0:00:00.162) 0:22:47.624 ******* ok: [managed-node4] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 18 March 2026 20:40:30 -0400 (0:00:00.323) 0:22:47.948 ******* ok: [managed-node4] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "63c9c93d-438e-4c50-886b-6f947888611e" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 18 March 2026 20:40:31 -0400 (0:00:01.348) 0:22:49.296 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003685", "end": "2026-03-18 20:40:32.529214", "rc": 0, "start": "2026-03-18 20:40:32.525529" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Mar 10 07:51:51 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=63c9c93d-438e-4c50-886b-6f947888611e / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 18 March 2026 20:40:32 -0400 (0:00:01.322) 0:22:50.619 ******* ok: [managed-node4] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002937", "end": "2026-03-18 20:40:33.744832", "failed_when_result": false, "rc": 0, "start": "2026-03-18 20:40:33.741895" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 18 March 2026 20:40:33 -0400 (0:00:01.209) 0:22:51.828 ******* skipping: [managed-node4] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Wednesday 18 March 2026 20:40:34 -0400 (0:00:00.257) 0:22:52.086 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node4 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=67DFfz-RcXl-0Z5W-fL92-nvtr-VU62-rytR2b'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 18 March 2026 20:40:34 -0400 (0:00:00.450) 0:22:52.536 
******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 18 March 2026 20:40:35 -0400 (0:00:00.352) 0:22:52.889 ******* included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node4 => (item=mount) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node4 => (item=fstab) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node4 => (item=fs) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node4 => (item=device) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node4 => (item=encryption) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node4 => (item=md) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node4 => (item=size) included: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node4 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 18 March 2026 20:40:36 -0400 (0:00:01.931) 0:22:54.821 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 18 March 2026 20:40:37 -0400 (0:00:00.273) 0:22:55.094 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 18 March 2026 20:40:37 -0400 (0:00:00.411) 0:22:55.506 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Wednesday 18 March 2026 20:40:37 -0400 (0:00:00.300) 0:22:55.806 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Wednesday 18 March 2026 20:40:38 -0400 (0:00:00.222) 0:22:56.028 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Wednesday 18 March 2026 20:40:38 -0400 (0:00:00.207) 0:22:56.236 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Wednesday 18 March 2026 20:40:38 -0400 (0:00:00.232) 0:22:56.468 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Wednesday 18 March 2026 20:40:38 -0400 (0:00:00.269) 0:22:56.738 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Wednesday 18 March 2026 20:40:39 -0400 (0:00:00.271) 0:22:57.009 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Wednesday 18 March 2026 20:40:39 -0400 (0:00:00.251) 0:22:57.260 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Wednesday 18 March 2026 20:40:39 -0400 (0:00:00.232) 0:22:57.493 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 18 March 2026 20:40:39 -0400 (0:00:00.250) 0:22:57.744 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], 
"storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 18 March 2026 20:40:40 -0400 (0:00:00.724) 0:22:58.468 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 18 March 2026 20:40:40 -0400 (0:00:00.219) 0:22:58.688 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 18 March 2026 20:40:41 -0400 (0:00:00.269) 0:22:58.957 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 18 March 2026 20:40:41 -0400 (0:00:00.225) 0:22:59.183 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Wednesday 18 March 2026 20:40:41 -0400 (0:00:00.326) 0:22:59.509 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 18 March 2026 20:40:41 -0400 (0:00:00.231) 0:22:59.740 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 18 March 2026 20:40:42 -0400 (0:00:00.296) 0:23:00.037 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 18 March 2026 20:40:42 -0400 (0:00:00.241) 0:23:00.279 ******* ok: [managed-node4] => { "changed": false, "stat": { "atime": 1773880814.780871, "attr_flags": 
"", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1773880814.780871, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1773880814.780871, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 18 March 2026 20:40:43 -0400 (0:00:01.190) 0:23:01.469 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 18 March 2026 20:40:43 -0400 (0:00:00.358) 0:23:01.828 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 18 March 2026 20:40:44 -0400 (0:00:00.206) 0:23:02.034 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 18 March 2026 20:40:44 -0400 (0:00:00.279) 0:23:02.313 ******* ok: [managed-node4] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 18 March 2026 20:40:44 -0400 (0:00:00.291) 0:23:02.604 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 18 March 2026 20:40:44 -0400 (0:00:00.222) 0:23:02.827 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 18 March 2026 20:40:45 -0400 (0:00:00.226) 0:23:03.054 ******* skipping: [managed-node4] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 18 March 2026 20:40:45 -0400 (0:00:00.237) 0:23:03.291 ******* ok: [managed-node4] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 18 March 2026 20:40:47 -0400 (0:00:01.904) 0:23:05.196 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 18 March 2026 20:40:47 -0400 (0:00:00.210) 0:23:05.406 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 18 March 2026 20:40:47 -0400 (0:00:00.204) 0:23:05.635 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 18 March 2026 20:40:47 -0400 (0:00:00.217) 0:23:05.853 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 18 March 2026 20:40:48 -0400 (0:00:00.262) 0:23:06.116 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 18 March 2026 20:40:48 -0400 (0:00:00.246) 0:23:06.362 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Wednesday 18 March 2026 20:40:48 -0400 (0:00:00.182) 0:23:06.545 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": 
"Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Wednesday 18 March 2026 20:40:48 -0400 (0:00:00.275) 0:23:06.820 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Wednesday 18 March 2026 20:40:49 -0400 (0:00:00.275) 0:23:07.096 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Wednesday 18 March 2026 20:40:49 -0400 (0:00:00.493) 0:23:07.590 ******* ok: [managed-node4] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Wednesday 18 March 2026 20:40:49 -0400 (0:00:00.250) 0:23:07.840 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.173) 0:23:08.013 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.105) 0:23:08.119 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.103) 0:23:08.223 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.198) 0:23:08.421 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": 
"Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.191) 0:23:08.613 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 18 March 2026 20:40:50 -0400 (0:00:00.209) 0:23:08.822 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 18 March 2026 20:40:51 -0400 (0:00:00.296) 0:23:09.119 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 18 March 2026 20:40:51 -0400 (0:00:00.152) 0:23:09.271 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 18 March 2026 20:40:51 -0400 (0:00:00.188) 0:23:09.460 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 18 March 2026 20:40:51 -0400 (0:00:00.209) 0:23:09.670 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 18 March 2026 20:40:51 -0400 (0:00:00.179) 0:23:09.850 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 18 March 2026 20:40:52 -0400 (0:00:00.232) 0:23:10.083 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task 
path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 18 March 2026 20:40:52 -0400 (0:00:00.209) 0:23:10.292 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 18 March 2026 20:40:52 -0400 (0:00:00.215) 0:23:10.508 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 18 March 2026 20:40:52 -0400 (0:00:00.204) 0:23:10.713 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 18 March 2026 20:40:53 -0400 (0:00:00.251) 0:23:10.964 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 18 March 2026 20:40:53 -0400 (0:00:00.208) 0:23:11.173 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 18 March 2026 20:40:53 -0400 (0:00:00.282) 0:23:11.456 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 18 March 2026 20:40:53 -0400 (0:00:00.283) 0:23:11.740 ******* skipping: [managed-node4] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 18 March 2026 20:40:54 -0400 (0:00:00.220) 0:23:11.960 ******* skipping: [managed-node4] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 18 March 2026 20:40:54 -0400 (0:00:00.171) 0:23:12.132 ******* skipping: [managed-node4] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the expected size 
based on pool size and percentage value] ***** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 18 March 2026 20:40:54 -0400 (0:00:00.185) 0:23:12.317 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Wednesday 18 March 2026 20:40:54 -0400 (0:00:00.258) 0:23:12.576 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Wednesday 18 March 2026 20:40:54 -0400 (0:00:00.222) 0:23:12.798 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Wednesday 18 March 2026 20:40:55 -0400 (0:00:00.261) 0:23:13.059 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Wednesday 18 March 2026 20:40:55 -0400 (0:00:00.302) 0:23:13.362 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Wednesday 18 March 2026 20:40:55 -0400 (0:00:00.282) 0:23:13.644 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Wednesday 18 March 2026 20:40:56 -0400 (0:00:00.273) 0:23:13.918 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Wednesday 18 March 2026 20:40:56 -0400 (0:00:00.242) 0:23:14.160 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Wednesday 18 March 2026 20:40:56 -0400 (0:00:00.249) 0:23:14.410 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Wednesday 18 March 2026 20:40:56 -0400 (0:00:00.317) 0:23:14.727 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Wednesday 18 March 2026 20:40:57 -0400 (0:00:00.273) 0:23:15.000 ******* skipping: [managed-node4] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Wednesday 18 March 2026 20:40:57 -0400 (0:00:00.279) 0:23:15.306 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Wednesday 18 March 2026 20:40:57 -0400 (0:00:00.293) 0:23:15.599 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Wednesday 18 March 2026 20:40:57 -0400 (0:00:00.158) 0:23:15.758 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Wednesday 18 March 2026 20:40:58 -0400 (0:00:00.217) 0:23:15.975 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Wednesday 18 March 2026 20:40:58 -0400 (0:00:00.270) 0:23:16.245 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Wednesday 18 March 2026 20:40:58 -0400 (0:00:00.199) 0:23:16.445 ******* ok: [managed-node4] => { "storage_test_actual_size": { "changed": false, 
"false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Wednesday 18 March 2026 20:40:58 -0400 (0:00:00.229) 0:23:16.674 ******* ok: [managed-node4] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Wednesday 18 March 2026 20:40:58 -0400 (0:00:00.096) 0:23:16.771 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.165) 0:23:16.937 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.189) 0:23:17.126 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.238) 0:23:17.365 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.191) 0:23:17.557 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.173) 0:23:17.730 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 18 March 2026 20:40:59 -0400 (0:00:00.140) 0:23:17.870 ******* skipping: [managed-node4] => { "changed": false, 
"false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 18 March 2026 20:41:00 -0400 (0:00:00.176) 0:23:18.047 ******* skipping: [managed-node4] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 18 March 2026 20:41:00 -0400 (0:00:00.160) 0:23:18.207 ******* ok: [managed-node4] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Wednesday 18 March 2026 20:41:00 -0400 (0:00:00.195) 0:23:18.402 ******* ok: [managed-node4] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed-node4 : ok=1271 changed=60 unreachable=0 failed=0 skipped=1068 rescued=18 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:18:52.192847+00:00Z", "host": "managed-node4", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-03-19T00:18:50.033177+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:18:52.451730+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", 
"thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:18:52.220491+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:20:49.220942+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17' in safe mode due to encryption removal", "start_time": "2026-03-19T00:20:46.869878+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:20:49.654098+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-2ff7d768-0179-41ab-9b55-845a651e9a17' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:20:49.314745+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:22:24.337472+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-03-19T00:22:22.137733+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:22:24.712831+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:22:24.399282+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:24:16.015825+00:00Z", "host": "managed-node4", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-19T00:24:13.689837+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:24:16.470176+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:24:16.101640+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:26:26.689868+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-37471199-8d7b-4b98-b054-dea418ad7607' in safe mode due to encryption removal", "start_time": "2026-03-19T00:26:24.235482+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:26:27.087726+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-37471199-8d7b-4b98-b054-dea418ad7607' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:26:26.738752+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:28:33.821765+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-03-19T00:28:31.431814+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:28:34.248743+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:28:33.920390+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:30:50.793734+00:00Z", "host": "managed-node4", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-03-19T00:30:48.396249+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:30:51.247531+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", 
"type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:30:50.852057+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:35:23.847719+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3' in safe mode due to encryption removal", "start_time": "2026-03-19T00:35:21.518696+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:35:24.174864+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, 
"pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-e3dc7518-65da-4f95-b3a9-fe385f43a1b3' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:35:23.892132+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:37:45.359370+00:00Z", "host": "managed-node4", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-03-19T00:37:42.932902+00:00Z", "task_name": "Manage the pools and volumes to match the specified 
state", "task_path": "/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70" }, { "ansible_version": "2.17.14", "end_time": "2026-03-19T00:37:45.711546+00:00Z", "host": "managed-node4", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-03-19T00:37:45.411034+00:00Z", "task_name": "Failed message", "task_path": 
"/tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:111" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Wednesday 18 March 2026 20:41:00 -0400 (0:00:00.169) 0:23:18.572 ******* =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.20s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.53s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.41s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.09s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.88s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.30s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:70 Gathering Facts --------------------------------------------------------- 6.70s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:2 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 5.47s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:161 fedora.linux_system_roles.storage : Get service facts ------------------- 5.13s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Write the key into the key file ----------------------------------------- 4.99s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:324 Create a key file ------------------------------------------------------- 4.43s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks2.yml:317 Get the canonical device path for each member device -------------------- 4.39s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.12s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Get required packages --------------- 4.08s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Ensure ansible_facts used by role --- 3.97s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 3.93s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 fedora.linux_system_roles.storage : Make sure required 
packages are installed --- 3.87s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:38 fedora.linux_system_roles.storage : Get service facts ------------------- 3.58s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:58 Find unused disks in the system ----------------------------------------- 3.56s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Create a file ----------------------------------------------------------- 3.56s /tmp/collections-h00/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
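Note on the errors collected above: they fall into two groups that the test exercises on purpose. The "cannot remove existing formatting ... in safe mode" messages come from the role's safe mode, which refuses to add or strip LUKS encryption (or otherwise reformat) an already-formatted device; the "encrypted volume 'test1' missing key/password" messages come from the check that an encrypted volume must be given a passphrase or key file. For reference only, a play of roughly the following shape would run the final LVM scenario without tripping either check. This is an illustrative sketch that assumes the role's documented storage_safe_mode, storage_pools, and encryption_* variables; the play layout and the passphrase value are placeholders, not taken from this run.

- hosts: managed-node4
  tasks:
    - name: Encrypt test1 with LUKS2, allowing existing formatting to be replaced
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # Safe mode defaults to true; disabling it permits destructive
        # reformatting, including adding or removing encryption.
        storage_safe_mode: false
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                fs_type: xfs
                encryption: true
                encryption_luks_version: luks2
                # A passphrase (or key file) is required for encrypted volumes;
                # this value is a placeholder.
                encryption_password: "<placeholder passphrase>"

Leaving storage_safe_mode at its default of true, or omitting encryption_password on an encrypted volume, reproduces the "safe mode" and "missing key/password" failures recorded in the error list above.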