ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-HLp
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.12 (main, Mar 9 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Thursday 16 April 2026 19:22:24 -0400 (0:00:00.419) 0:00:00.419 ********
[WARNING]: Platform linux on host managed-node13 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node13]
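The [WARNING] above is interpreter discovery: ansible-core found /usr/bin/python3.9 on the managed node and will keep using that path unless told otherwise. Pinning the interpreter in inventory makes the choice explicit and silences the warning. A minimal sketch, assuming a YAML inventory (the placement and file layout are illustrative; ansible_python_interpreter itself is the real Ansible variable):

    # hypothetical inventory.yml snippet
    all:
      hosts:
        managed-node13:
          # pin the interpreter discovery picked, so a later Python
          # install cannot silently change the meaning of the path
          ansible_python_interpreter: /usr/bin/python3.9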
TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Thursday 16 April 2026 19:22:31 -0400 (0:00:07.094) 0:00:07.513 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:28
Thursday 16 April 2026 19:22:31 -0400 (0:00:00.276) 0:00:07.790 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Enable FIPS mode - 2] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:39
Thursday 16 April 2026 19:22:32 -0400 (0:00:00.236) 0:00:08.026 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 2] **************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:43
Thursday 16 April 2026 19:22:32 -0400 (0:00:00.214) 0:00:08.241 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Thursday 16 April 2026 19:22:32 -0400 (0:00:00.219) 0:00:08.460 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:59
Thursday 16 April 2026 19:22:32 -0400 (0:00:00.190) 0:00:08.650 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Reboot - 3] **************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:68
Thursday 16 April 2026 19:22:32 -0400 (0:00:00.226) 0:00:08.876 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "lookup(\"env\", \"SYSTEM_ROLES_TEST_FIPS\") == \"true\"", "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:72
Thursday 16 April 2026 19:22:33 -0400 (0:00:00.294) 0:00:09.170 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
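Every task in the FIPS block above is gated on the same environment lookup, so the whole setup is a no-op unless the CI job exports SYSTEM_ROLES_TEST_FIPS=true. A minimal sketch of the gating pattern, assuming one representative task (the package name is taken from the "Ensure dracut-fips" task title; the surrounding structure is illustrative):

    # hypothetical excerpt from tests_luks.yml
    - name: Ensure dracut-fips
      ansible.builtin.package:
        name: dracut-fips
        state: present
      # the exact condition logged as false_condition above
      when: lookup("env", "SYSTEM_ROLES_TEST_FIPS") == "true"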
TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:22:33 -0400 (0:00:00.370) 0:00:09.541 ********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Thursday 16 April 2026 19:22:33 -0400 (0:00:00.039) 0:00:09.581 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Thursday 16 April 2026 19:22:33 -0400 (0:00:00.311) 0:00:09.892 ********
included: fedora.linux_system_roles.storage for managed-node13

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 16 April 2026 19:22:35 -0400 (0:00:01.237) 0:00:11.130 ********
ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7
Thursday 16 April 2026 19:22:38 -0400 (0:00:03.440) 0:00:14.571 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 16 April 2026 19:22:38 -0400 (0:00:00.244) 0:00:14.816 ********
ok: [managed-node13]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 16 April 2026 19:22:40 -0400 (0:00:01.711) 0:00:16.528 ********
skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
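The vars-loading loop above tries increasingly specific files (os family, then distribution, then distribution plus version) and loads whichever exist on disk, which is why RedHat.yml and CentOS.yml are skipped while CentOS_9.yml loads. A sketch of the pattern, assuming the loop items are built from ansible_facts (the exact patterns are assumptions; only the logged __vars_file is file condition is taken from the output, and on this host the last two patterns would both resolve to CentOS_9.yml, matching the two ok results):

    # hypothetical excerpt from set_vars.yml
    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml
        - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      # only load files that actually exist in the role's vars/ dir
      when: __vars_file is file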
"python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:22:41 -0400 (0:00:00.616) 0:00:17.144 ******** ok: [managed-node13] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:22:43 -0400 (0:00:02.526) 0:00:19.670 ******** ok: [managed-node13] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:22:44 -0400 (0:00:00.361) 0:00:20.032 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:22:44 -0400 (0:00:00.182) 0:00:20.215 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:22:44 -0400 (0:00:00.197) 0:00:20.412 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:22:45 -0400 (0:00:00.755) 0:00:21.168 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:22:45 -0400 (0:00:00.254) 0:00:21.422 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10
Thursday 16 April 2026 19:22:44 -0400 (0:00:00.361) 0:00:20.032 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14
Thursday 16 April 2026 19:22:44 -0400 (0:00:00.182) 0:00:20.215 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18
Thursday 16 April 2026 19:22:44 -0400 (0:00:00.197) 0:00:20.412 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Thursday 16 April 2026 19:22:45 -0400 (0:00:00.755) 0:00:21.168 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Thursday 16 April 2026 19:22:45 -0400 (0:00:00.254) 0:00:21.422 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Thursday 16 April 2026 19:22:45 -0400 (0:00:00.222) 0:00:21.645 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Thursday 16 April 2026 19:22:49 -0400 (0:00:04.199) 0:00:25.845 ********
ok: [managed-node13] => { "storage_pools | d([])": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Thursday 16 April 2026 19:22:50 -0400 (0:00:00.266) 0:00:26.112 ********
ok: [managed-node13] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Thursday 16 April 2026 19:22:50 -0400 (0:00:00.204) 0:00:26.316 ********
ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Thursday 16 April 2026 19:22:54 -0400 (0:00:03.994) 0:00:30.311 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Thursday 16 April 2026 19:22:54 -0400 (0:00:00.261) 0:00:30.572 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Thursday 16 April 2026 19:22:54 -0400 (0:00:00.137) 0:00:30.709 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Thursday 16 April 2026 19:22:54 -0400 (0:00:00.169) 0:00:30.879 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Thursday 16 April 2026 19:22:55 -0400 (0:00:00.161) 0:00:31.040 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Thursday 16 April 2026 19:22:56 -0400 (0:00:01.906) 0:00:32.947 ********
ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name":
"NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": 
"modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": 
"rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:23:01 -0400 (0:00:04.150) 0:00:37.098 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:23:01 -0400 (0:00:00.403) 0:00:37.501 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:23:03 -0400 (0:00:01.666) 0:00:39.168 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:23:03 -0400 (0:00:00.171) 0:00:39.339 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381690.0546222, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "00005ca9454e591399a89ce4ad24e534aeafadc6", "ctime": 1776381688.7176218, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776381688.7176218, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:23:04 -0400 (0:00:01.183) 0:00:40.522 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:23:04 -0400 (0:00:00.195) 0:00:40.718 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in 
the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:23:05 -0400 (0:00:00.411) 0:00:41.130 ******** ok: [managed-node13] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:23:05 -0400 (0:00:00.257) 0:00:41.387 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:23:05 -0400 (0:00:00.240) 0:00:41.628 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:23:05 -0400 (0:00:00.294) 0:00:41.923 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:23:06 -0400 (0:00:00.205) 0:00:42.129 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:23:06 -0400 (0:00:00.176) 0:00:42.305 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:23:06 -0400 (0:00:00.249) 0:00:42.555 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:23:06 -0400 (0:00:00.239) 0:00:42.794 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output['mounts'] | length > 0", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:23:07 -0400 (0:00:00.248) 0:00:43.043 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Thursday 16 April 2026 19:23:07 -0400 (0:00:00.248) 0:00:43.043 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776380452.4972835, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776150304.367, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776149993.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2057026748", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Thursday 16 April 2026 19:23:08 -0400 (0:00:01.242) 0:00:44.285 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Thursday 16 April 2026 19:23:08 -0400 (0:00:00.128) 0:00:44.419 ********
ok: [managed-node13]

TASK [Get unused disks] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:75
Thursday 16 April 2026 19:23:10 -0400 (0:00:01.861) 0:00:46.280 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node13

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2
Thursday 16 April 2026 19:23:10 -0400 (0:00:00.471) 0:00:46.752 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
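/etc/crypttab is reconciled against the crypts list returned by the blivet run; with no LUKS changes yet, the list is empty and the task is skipped. A sketch assuming each crypts entry provides name/backing_device/password/state keys (the key names and the line format are assumptions, not the role's confirmed implementation):

    # hypothetical excerpt from main-blivet.yml
    - name: Manage /etc/crypttab to account for changes we just made
      ansible.builtin.lineinfile:
        path: /etc/crypttab
        regexp: '^{{ item.name }}\s'
        line: "{{ item.name }} {{ item.backing_device }} {{ item.password | d('-') }}"
        state: "{{ item.state }}"   # assumed to be 'present' or 'absent'
      loop: "{{ blivet_output.crypts }}"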
TASK [Find unused disks in the system] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11
Thursday 16 April 2026 19:23:12 -0400 (0:00:01.988) 0:00:48.741 ********
ok: [managed-node13] => { "changed": false, "disks": [ "sda" ], "info": [
  "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG_SEC=\"512\"",
  "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
  "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"xfs\" LOG_SEC=\"512\"",
  "filename [xvda1] is a partition",
  "Disk [/dev/xvda] attrs [{'type': 'disk', 'size': '268435456000', 'fstype': '', 'ssize': '512'}] has partitions" ] }

TASK [Debug why there are no unused disks] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20
Thursday 16 April 2026 19:23:15 -0400 (0:00:03.175) 0:00:51.916 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "'Unable to find unused disk' in unused_disks_return.disks", "skip_reason": "Conditional result was False" }

TASK [Set unused_disks if necessary] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29
Thursday 16 April 2026 19:23:16 -0400 (0:00:00.178) 0:00:52.095 ********
ok: [managed-node13] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false }

TASK [Exit playbook when there's not enough unused disks in the system] ********
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34
Thursday 16 April 2026 19:23:16 -0400 (0:00:00.328) 0:00:52.424 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "unused_disks | d([]) | length < disks_needed | d(1)", "skip_reason": "Conditional result was False" }

TASK [Print unused disks] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39
Thursday 16 April 2026 19:23:16 -0400 (0:00:00.256) 0:00:52.680 ********
ok: [managed-node13] => { "unused_disks": [ "sda" ] }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:84
Thursday 16 April 2026 19:23:16 -0400 (0:00:00.284) 0:00:52.965 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Thursday 16 April 2026 19:23:17 -0400 (0:00:00.516) 0:00:53.482 ********
ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Thursday 16 April 2026 19:23:17 -0400 (0:00:00.305) 0:00:53.788 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13

TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:23:18 -0400 (0:00:00.282) 0:00:54.070 ********
META: facts cleared
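This test deliberately asks the role to create an encrypted volume without supplying a key while safe mode stays on (storage_safe_mode_global is saved as true above), and then verifies the role fails with the right error. A condensed sketch of the expect-failure pattern (the volume spec and the message regex are illustrative; only the role name, unused_disks, and the test title come from the log):

    # hypothetical condensed form of verify-role-failed.yml
    - name: Verify role raises correct error
      block:
        - name: Run the role and expect it to fail
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: true
            storage_volumes:
              - name: test1            # illustrative volume name
                type: disk
                disks: "{{ unused_disks }}"
                encryption: true       # encrypted, but no key given
        - name: Fail if the role did not fail
          ansible.builtin.fail:
            msg: Role did not raise the expected error
      rescue:
        - name: Check the returned error message
          ansible.builtin.assert:
            that: ansible_failed_result.msg is search("missing key")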
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:23:18 -0400 (0:00:00.022) 0:00:54.092 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:23:18 -0400 (0:00:00.203) 0:00:54.295 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:23:18 -0400 (0:00:00.419) 0:00:54.715 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:23:20 -0400 (0:00:01.443) 0:00:56.158 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:23:20 -0400 (0:00:00.262) 0:00:56.421 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:23:22 -0400 (0:00:01.746) 0:00:58.168 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ 
"/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:23:22 -0400 (0:00:00.826) 0:00:58.995 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:23:23 -0400 (0:00:00.164) 0:00:59.159 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:23:23 -0400 (0:00:00.184) 0:00:59.344 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:23:23 -0400 (0:00:00.155) 0:00:59.500 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:23:23 -0400 (0:00:00.175) 0:00:59.675 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:23:24 -0400 (0:00:00.726) 0:01:00.402 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:23:24 -0400 (0:00:00.269) 0:01:00.672 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:23:24 -0400 (0:00:00.254) 0:01:00.926 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK 
[fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:23:26 -0400 (0:00:01.831) 0:01:02.757 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:23:26 -0400 (0:00:00.243) 0:01:03.000 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:23:27 -0400 (0:00:00.332) 0:01:03.333 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:23:29 -0400 (0:00:01.781) 0:01:05.114 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:23:29 -0400 (0:00:00.349) 0:01:05.464 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:23:29 -0400 (0:00:00.106) 0:01:05.571 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:23:29 -0400 (0:00:00.151) 0:01:05.722 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:23:29 -0400 (0:00:00.124) 0:01:05.847 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:23:31 -0400 (0:00:01.782) 0:01:07.629 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", 
"source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, 
"dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": 
"initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": 
"systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": 
"stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:23:34 -0400 (0:00:02.482) 0:01:10.112 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:23:34 -0400 (0:00:00.276) 0:01:10.393 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:23:36 -0400 (0:00:01.874) 0:01:12.267 ******** fatal: [managed-node13]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'foo' missing key/password", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': 
None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:23:36 -0400 (0:00:00.422) 0:01:12.690 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:23:37 -0400 (0:00:00.394) 0:01:13.085 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:23:37 -0400 (0:00:00.239) 0:01:13.325 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:23:37 -0400 (0:00:00.395) 0:01:13.720 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Thursday 16 April 2026 19:23:37 -0400 (0:00:00.213) 0:01:13.933 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:23:38 -0400 (0:00:00.422) 0:01:14.356 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:23:38 -0400 (0:00:00.012) 0:01:14.369 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:23:38 -0400 (0:00:00.144) 0:01:14.513 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK 
[fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:23:38 -0400 (0:00:00.183) 0:01:14.697 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:23:39 -0400 (0:00:01.293) 0:01:15.990 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:23:40 -0400 (0:00:00.295) 0:01:16.286 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:23:41 -0400 (0:00:01.707) 0:01:17.993 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:23:42 -0400 (0:00:00.595) 0:01:18.588 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:23:42 -0400 (0:00:00.238) 0:01:18.827 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:23:42 -0400 (0:00:00.161) 0:01:18.988 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:23:43 -0400 (0:00:00.180) 0:01:19.169 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:23:43 -0400 (0:00:00.249) 0:01:19.418 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:23:44 -0400 (0:00:00.650) 0:01:20.069 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:23:44 -0400 (0:00:00.232) 0:01:20.301 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:23:44 -0400 (0:00:00.192) 0:01:20.494 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:23:46 -0400 (0:00:01.856) 0:01:22.351 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:23:46 -0400 (0:00:00.254) 0:01:22.605 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, 
"encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:23:46 -0400 (0:00:00.262) 0:01:22.867 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:23:48 -0400 (0:00:01.963) 0:01:24.831 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:23:49 -0400 (0:00:00.225) 0:01:25.057 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:23:49 -0400 (0:00:00.150) 0:01:25.208 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:23:49 -0400 (0:00:00.300) 0:01:25.509 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:23:49 -0400 (0:00:00.178) 0:01:25.688 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:23:52 -0400 (0:00:02.610) 0:01:28.298 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": 
"dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": 
"systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:23:55 -0400 (0:00:02.754) 0:01:31.052 ******** skipping: 
[managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:23:55 -0400 (0:00:00.482) 0:01:31.539 ******** changed: [managed-node13] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:24:06 -0400 (0:00:11.323) 0:01:42.862 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:24:07 -0400 (0:00:00.261) 0:01:43.124 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381690.0546222, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "00005ca9454e591399a89ce4ad24e534aeafadc6", "ctime": 1776381688.7176218, "dev": 51713, "device_type": 0, "executable": false, 
"exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776381688.7176218, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1408, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:24:08 -0400 (0:00:01.266) 0:01:44.390 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:24:11 -0400 (0:00:03.489) 0:01:47.879 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:24:12 -0400 (0:00:00.460) 0:01:48.340 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, 
"raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:24:12 -0400 (0:00:00.341) 0:01:48.697 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:24:12 -0400 (0:00:00.223) 0:01:48.920 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:24:13 -0400 (0:00:00.269) 0:01:49.190 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:24:13 -0400 (0:00:00.250) 0:01:49.440 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:24:19 -0400 (0:00:05.992) 0:01:55.432 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", 
"changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:24:23 -0400 (0:00:03.692) 0:01:59.124 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:24:23 -0400 (0:00:00.315) 0:01:59.439 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:24:25 -0400 (0:00:01.617) 0:02:01.057 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776380452.4972835, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776150304.367, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 4194436, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776149993.525, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "2057026748", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:24:26 -0400 (0:00:01.244) 0:02:02.301 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda', 'name': 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": 
"luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:24:27 -0400 (0:00:01.366) 0:02:03.668 ******** ok: [managed-node13] TASK [Verify role results] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:110 Thursday 16 April 2026 19:24:29 -0400 (0:00:01.748) 0:02:05.418 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:24:30 -0400 (0:00:00.619) 0:02:06.038 ******** skipping: [managed-node13] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:24:30 -0400 (0:00:00.267) 0:02:06.306 ******** ok: [managed-node13] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:24:30 -0400 (0:00:00.360) 0:02:06.666 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "size": "10G", "type": "crypt", "uuid": "9af0c40d-0567-483a-a7e5-068b325b0642" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "081a4f92-2987-47ea-b6a5-2bc265a88537" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:24:33 -0400 (0:00:03.029) 0:02:09.696 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003029", "end": "2026-04-16 19:24:36.135999", "rc": 0, "start": "2026-04-16 19:24:36.132970" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:24:36 -0400 (0:00:02.622) 0:02:12.319 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003093", "end": "2026-04-16 19:24:37.391057", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:24:37.387964" } STDOUT: luks-081a4f92-2987-47ea-b6a5-2bc265a88537 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:24:37 -0400 (0:00:01.249) 0:02:13.568 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:24:37 -0400 (0:00:00.168) 0:02:13.737 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537', '_kernel_device': 
'/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:24:38 -0400 (0:00:00.492) 0:02:14.229 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:24:38 -0400 (0:00:00.301) 0:02:14.531 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:24:40 -0400 (0:00:02.025) 0:02:16.557 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:24:40 -0400 (0:00:00.406) 0:02:16.965 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:24:41 -0400 (0:00:00.408) 0:02:17.374 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", 
"skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:24:41 -0400 (0:00:00.440) 0:02:17.814 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:24:42 -0400 (0:00:00.289) 0:02:18.103 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:24:42 -0400 (0:00:00.380) 0:02:18.484 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:24:42 -0400 (0:00:00.413) 0:02:18.897 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:24:43 -0400 (0:00:00.325) 0:02:19.223 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:24:43 -0400 (0:00:00.209) 0:02:19.432 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:24:43 -0400 (0:00:00.200) 0:02:19.633 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:24:43 -0400 (0:00:00.222) 0:02:19.855 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, 
"changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:24:44 -0400 (0:00:00.293) 0:02:20.148 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:24:44 -0400 (0:00:00.701) 0:02:20.849 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:24:45 -0400 (0:00:00.314) 0:02:21.164 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:24:45 -0400 (0:00:00.273) 0:02:21.438 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:24:45 -0400 (0:00:00.235) 0:02:21.674 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:24:46 -0400 (0:00:00.354) 0:02:22.028 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:24:46 -0400 (0:00:00.218) 0:02:22.246 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:24:46 -0400 (0:00:00.335) 0:02:22.582 ******** ok: [managed-node13] => { "changed": 
false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:24:46 -0400 (0:00:00.361) 0:02:22.944 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381846.3146665, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776381846.3146665, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776381846.3146665, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:24:48 -0400 (0:00:01.153) 0:02:24.097 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:24:48 -0400 (0:00:00.263) 0:02:24.360 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:24:48 -0400 (0:00:00.214) 0:02:24.575 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:24:48 -0400 (0:00:00.361) 0:02:24.937 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:24:49 -0400 (0:00:00.268) 0:02:25.205 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:24:49 -0400 (0:00:00.196) 0:02:25.402 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:24:49 -0400 (0:00:00.287) 0:02:25.689 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381846.5606666, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776381846.5606666, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 998, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776381846.5606666, "nlink": 1, "path": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:24:50 -0400 (0:00:01.291) 0:02:26.980 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:24:52 -0400 (0:00:01.785) 0:02:28.766 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.006987", "end": "2026-04-16 19:24:53.894544", "rc": 0, "start": "2026-04-16 19:24:53.887557" }

STDOUT:

LUKS header information
Version:        2
Epoch:          3
Metadata area:  16384 [bytes]
Keyslots area:  16744448 [bytes]
UUID:           081a4f92-2987-47ea-b6a5-2bc265a88537
Label:          (no label)
Subsystem:      (no subsystem)
Flags:          (no flags)

Data segments:
  0: crypt
        offset: 16777216 [bytes]
        length: (whole device)
        cipher: aes-xts-plain64
        sector: 512 [bytes]

Keyslots:
  0: luks2
        Key:        512 bits
        Priority:   normal
        Cipher:     aes-xts-plain64
        Cipher key: 512 bits
        PBKDF:      argon2id
        Time cost:  4
        Memory:     686240
        Threads:    2
        Salt:       0f b0 d2 cc 41 da ee 60 f7 6a 3c a1 8f 2d a1 95 69 9a 91 87 7d ba 7c a4 12 2a 61 85 cb 6c 64 62
        AF stripes: 4000
        AF hash:    sha256
        Area offset:32768 [bytes]
        Area length:258048 [bytes]
        Digest ID:  0
Tokens:
Digests:
  0: pbkdf2
        Hash:       sha256
        Iterations: 131334
        Salt:       be 48 53 9c 6f 7f 79 03 bd c5 42 4b 82 2c 34 c9 29 3f 3d 17 b2 d2 03 8e e8 61 f4 03 45 6c db 40
        Digest:     cb 21 cd 7a bb b6 5c 0a 52 71 ea 79 d8 00 25 ce 1f d4 45 2f 45 fd 9a bb 24 59 b7 9b 2a 0a 86 75

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:24:54 -0400 (0:00:01.316) 0:02:30.083 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:24:54 -0400 (0:00:00.261) 0:02:30.345 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK
[Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:24:54 -0400 (0:00:00.330) 0:02:30.675 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:24:54 -0400 (0:00:00.345) 0:02:31.021 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:24:55 -0400 (0:00:00.316) 0:02:31.337 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:24:55 -0400 (0:00:00.343) 0:02:31.681 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:24:55 -0400 (0:00:00.324) 0:02:32.005 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:24:56 -0400 (0:00:00.263) 0:02:32.269 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-081a4f92-2987-47ea-b6a5-2bc265a88537 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:24:56 -0400 (0:00:00.459) 0:02:32.728 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:24:57 -0400 (0:00:00.408) 0:02:33.136 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 
2026 19:24:57 -0400 (0:00:00.377) 0:02:33.524 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:24:57 -0400 (0:00:00.332) 0:02:33.857 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:24:58 -0400 (0:00:00.286) 0:02:34.144 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:24:58 -0400 (0:00:00.195) 0:02:34.339 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:24:58 -0400 (0:00:00.209) 0:02:34.549 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:24:58 -0400 (0:00:00.243) 0:02:34.792 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:24:59 -0400 (0:00:00.290) 0:02:35.083 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:24:59 -0400 (0:00:00.186) 0:02:35.269 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:24:59 -0400 (0:00:00.263) 0:02:35.533 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } 
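
Note on the run of `skipping:` results in this stretch of the log: each check in test-verify-volume-md.yml is gated on the volume type with a `when:` condition, and because the volume under test is a plain `disk`, Ansible evaluates the guard to false and reports it back as `false_condition`. A minimal sketch of what a gated check such as the following "Check RAID active devices count" task might look like; the variable names `storage_test_md_info` and `storage_test_md_active_devices_re` are assumptions for illustration, not the exact names defined in the real task file:

```yaml
# Illustrative only: a type-gated verification task modeled on the skip
# output above. Variable names are hypothetical, not copied from
# tests/storage/test-verify-volume-md.yml.
- name: Check RAID active devices count
  ansible.builtin.assert:
    that:
      # e.g. a regex built earlier by a "Set active devices regex" task,
      # matched against previously gathered RAID info
      - storage_test_md_info.stdout is search(storage_test_md_active_devices_re)
    fail_msg: Unexpected number of active RAID devices
  # For this test run storage_test_volume.type is 'disk', so this guard is
  # false and the task is logged as: skipping: [managed-node13] =>
  # { "false_condition": "storage_test_volume.type == 'raid'", ... }
  when: storage_test_volume.type == 'raid'
```

Skipped tasks still emit a timestamp and duration line in the log, which is why the elapsed-time counters keep advancing through this block even though nothing runs on the managed node.
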
TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:24:59 -0400 (0:00:00.211) 0:02:35.744 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:24:59 -0400 (0:00:00.215) 0:02:35.959 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:25:00 -0400 (0:00:00.235) 0:02:36.195 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:25:00 -0400 (0:00:00.149) 0:02:36.345 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:25:00 -0400 (0:00:00.228) 0:02:36.573 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:25:00 -0400 (0:00:00.156) 0:02:36.730 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:25:00 -0400 (0:00:00.200) 0:02:36.930 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:25:01 -0400 (0:00:00.220) 0:02:37.151 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:25:01 -0400 (0:00:00.308) 0:02:37.460 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:25:01 -0400 (0:00:00.248) 0:02:37.708 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:25:01 -0400 (0:00:00.193) 0:02:37.902 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:25:02 -0400 (0:00:00.222) 0:02:38.125 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:25:02 -0400 (0:00:00.196) 0:02:38.321 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:25:02 -0400 (0:00:00.203) 0:02:38.524 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:25:02 -0400 (0:00:00.315) 0:02:38.839 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:25:03 -0400 (0:00:00.268) 0:02:39.108 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:25:03 -0400 (0:00:00.313) 0:02:39.422 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not 
storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:25:03 -0400 (0:00:00.342) 0:02:39.764 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:25:03 -0400 (0:00:00.227) 0:02:39.992 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:25:04 -0400 (0:00:00.387) 0:02:40.379 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:25:04 -0400 (0:00:00.364) 0:02:40.744 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:25:05 -0400 (0:00:00.293) 0:02:41.037 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:25:05 -0400 (0:00:00.285) 0:02:41.323 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:25:05 -0400 (0:00:00.283) 0:02:41.606 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:25:05 -0400 (0:00:00.275) 0:02:41.882 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:25:06 -0400 (0:00:00.304) 0:02:42.187 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:25:06 -0400 (0:00:00.306) 0:02:42.494 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:25:06 -0400 (0:00:00.311) 0:02:42.805 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:25:07 -0400 (0:00:00.243) 0:02:43.049 ******** ok: [managed-node13] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:25:07 -0400 (0:00:00.260) 0:02:43.310 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:25:07 -0400 (0:00:00.262) 0:02:43.572 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:25:07 -0400 (0:00:00.361) 0:02:43.933 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:25:08 -0400 (0:00:00.732) 0:02:44.665 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] 
****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:25:08 -0400 (0:00:00.155) 0:02:44.821 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:25:09 -0400 (0:00:00.248) 0:02:45.069 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:25:09 -0400 (0:00:00.222) 0:02:45.292 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:25:09 -0400 (0:00:00.244) 0:02:45.537 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:25:09 -0400 (0:00:00.192) 0:02:45.730 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:25:09 -0400 (0:00:00.218) 0:02:45.948 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:25:10 -0400 (0:00:00.216) 0:02:46.165 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Thursday 16 April 2026 19:25:10 -0400 (0:00:00.257) 0:02:46.422 ******** changed: [managed-node13] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": 
"unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:116 Thursday 16 April 2026 19:25:13 -0400 (0:00:03.310) 0:02:49.732 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Thursday 16 April 2026 19:25:14 -0400 (0:00:00.551) 0:02:50.284 ******** ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Thursday 16 April 2026 19:25:14 -0400 (0:00:00.228) 0:02:50.513 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:25:14 -0400 (0:00:00.351) 0:02:50.864 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:25:14 -0400 (0:00:00.017) 0:02:50.882 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:25:15 -0400 (0:00:00.245) 0:02:51.127 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:25:15 -0400 (0:00:00.464) 0:02:51.592 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:25:16 -0400 (0:00:01.409) 0:02:53.002 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:25:17 -0400 (0:00:00.197) 0:02:53.200 ******** ok: [managed-node13] TASK 
[fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:25:18 -0400 (0:00:01.668) 0:02:54.868 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:25:19 -0400 (0:00:00.919) 0:02:55.788 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:25:20 -0400 (0:00:00.288) 0:02:56.076 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:25:20 -0400 (0:00:00.288) 0:02:56.365 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:25:20 -0400 (0:00:00.222) 0:02:56.587 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:25:20 -0400 (0:00:00.190) 0:02:56.778 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:25:21 -0400 (0:00:00.808) 0:02:57.586 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:25:21 -0400 (0:00:00.255) 0:02:57.842 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:25:22 -0400 (0:00:00.278) 0:02:58.121 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:25:24 -0400 (0:00:01.932) 0:03:00.053 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:25:24 -0400 (0:00:00.236) 0:03:00.290 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:25:24 -0400 (0:00:00.368) 0:03:00.660 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:25:26 -0400 (0:00:01.988) 0:03:02.649 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:25:27 -0400 (0:00:00.465) 
0:03:03.114 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:25:27 -0400 (0:00:00.202) 0:03:03.317 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:25:27 -0400 (0:00:00.205) 0:03:03.522 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:25:27 -0400 (0:00:00.241) 0:03:03.764 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:25:29 -0400 (0:00:01.888) 0:03:05.653 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": 
"man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": 
"oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": 
"sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": 
"systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:25:32 -0400 (0:00:02.753) 0:03:08.406 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:25:32 -0400 (0:00:00.397) 0:03:08.804 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:25:34 -0400 (0:00:02.211) 0:03:11.016 ******** fatal: [managed-node13]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:25:35 -0400 (0:00:00.257) 0:03:11.273 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:25:35 -0400 (0:00:00.366) 0:03:11.640 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: 
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:25:35 -0400 (0:00:00.223) 0:03:11.863 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:25:36 -0400 (0:00:00.427) 0:03:12.291 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:25:36 -0400 (0:00:00.361) 0:03:12.653 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381913.5256832, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776381913.5256832, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776381913.5256832, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2961549636", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:25:37 -0400 (0:00:01.293) 0:03:13.946 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
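The "Stat the file" / "Assert file presence" pair above is the data-preservation half of the negative test: /opt/test1/quux was created before the refused operation and must still exist afterwards, proving that safe mode stopped blivet before anything was written to disk. Roughly what the included verify-data-preservation.yml tasks amount to (a sketch; the register name __data_file is illustrative, not the test's actual variable):

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __data_file

    - name: Assert file presence
      ansible.builtin.assert:
        that:
          - __data_file.stat.exists
        fail_msg: data file is gone, so the refused operation destroyed data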
TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:136 Thursday 16 April 2026 19:25:38 -0400 (0:00:00.352) 0:03:14.299 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:25:38 -0400 (0:00:00.617) 0:03:14.916 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:25:38 -0400 (0:00:00.001) 0:03:14.918 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:25:39 -0400 (0:00:00.243) 0:03:15.161 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:25:39 -0400 (0:00:00.316) 0:03:15.478 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:25:40 -0400 (0:00:01.434) 0:03:16.913 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:25:41 -0400 (0:00:00.239) 0:03:17.152 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:25:43 -0400 (0:00:01.887) 0:03:19.039 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:25:43 -0400 (0:00:00.645) 0:03:19.685 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to
indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:25:43 -0400 (0:00:00.338) 0:03:20.024 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:25:44 -0400 (0:00:00.203) 0:03:20.227 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:25:44 -0400 (0:00:00.154) 0:03:20.382 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:25:44 -0400 (0:00:00.186) 0:03:20.568 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:25:45 -0400 (0:00:00.624) 0:03:21.192 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:25:45 -0400 (0:00:00.296) 0:03:21.488 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:25:45 -0400 (0:00:00.276) 0:03:21.764 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:25:47 -0400 (0:00:02.019) 0:03:23.784 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:25:48 -0400 (0:00:00.308) 0:03:24.092 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }
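The storage_volumes value printed above is the entire "remove the encryption layer" request: the same disk, name, and mount point as the encrypted configuration, with encryption flipped to false. As the variable a playbook would set (values exactly as displayed):

    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        encryption: false                    # was true while the volume was LUKS-encrypted
        encryption_password: yabbadabbadoo   # carried over from the encrypted configuration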
"sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:25:48 -0400 (0:00:00.274) 0:03:24.367 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:25:50 -0400 (0:00:02.121) 0:03:26.489 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:25:51 -0400 (0:00:00.540) 0:03:27.029 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:25:51 -0400 (0:00:00.188) 0:03:27.218 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:25:51 -0400 (0:00:00.256) 0:03:27.475 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:25:51 -0400 (0:00:00.235) 0:03:27.710 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:25:53 -0400 (0:00:01.888) 0:03:29.599 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": 
"auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:25:56 -0400 (0:00:02.846) 
0:03:32.445 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:25:56 -0400 (0:00:00.314) 0:03:32.759 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:25:59 -0400 (0:00:02.493) 0:03:35.253 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:25:59 -0400 (0:00:00.146) 0:03:35.399 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381862.9156706, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": 
"a102b70cb53a9cd0661ad0067d24a0ae4b02349c", "ctime": 1776381862.9126706, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776381862.9126706, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:26:00 -0400 (0:00:00.970) 0:03:36.369 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:26:01 -0400 (0:00:01.048) 0:03:37.418 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:26:01 -0400 (0:00:00.423) 0:03:37.841 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, 
"mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:26:02 -0400 (0:00:00.218) 0:03:38.060 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:26:02 -0400 (0:00:00.214) 0:03:38.275 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:26:02 -0400 (0:00:00.258) 0:03:38.533 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-081a4f92-2987-47ea-b6a5-2bc265a88537" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:26:03 -0400 (0:00:01.285) 0:03:39.819 ******** ok: [managed-node13] => { "changed": false, "name": null, 
"status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:26:05 -0400 (0:00:01.718) 0:03:41.537 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': 'UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:26:06 -0400 (0:00:01.341) 0:03:42.879 ******** skipping: [managed-node13] => (item={'src': 'UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:26:07 -0400 (0:00:00.389) 0:03:43.268 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:26:08 -0400 (0:00:01.572) 0:03:44.840 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381877.3896742, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8d02596f617e678695cceb6d51f9e246581ad16f", "ctime": 1776381867.4426718, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 373293262, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776381867.4428964, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "2033866683", 
"wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:26:09 -0400 (0:00:01.139) 0:03:45.980 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda', 'name': 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:26:11 -0400 (0:00:01.269) 0:03:47.249 ******** ok: [managed-node13] TASK [Verify role results - 2] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:148 Thursday 16 April 2026 19:26:13 -0400 (0:00:01.879) 0:03:49.129 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:26:13 -0400 (0:00:00.518) 0:03:49.648 ******** skipping: [managed-node13] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:26:13 -0400 (0:00:00.186) 0:03:49.835 ******** ok: [managed-node13] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:26:14 -0400 (0:00:00.233) 0:03:50.068 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "858846c8-d74e-4cd0-9e12-41cb8c400d2b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:26:15 -0400 (0:00:01.081) 0:03:51.150 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003328", "end": "2026-04-16 19:26:16.002509", "rc": 0, "start": "2026-04-16 19:26:15.999181" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:26:16 -0400 (0:00:01.038) 0:03:52.188 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002891", "end": "2026-04-16 19:26:17.131025", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:26:17.128134" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:26:17 -0400 (0:00:01.123) 0:03:53.312 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:26:17 -0400 (0:00:00.149) 0:03:53.462 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10720641024, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b', '_kernel_device': '/dev/sda', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task 
path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:26:17 -0400 (0:00:00.435) 0:03:53.898 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:26:18 -0400 (0:00:00.272) 0:03:54.170 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:26:19 -0400 (0:00:01.718) 0:03:55.889 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:26:20 -0400 (0:00:00.409) 0:03:56.298 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:26:20 -0400 (0:00:00.433) 0:03:56.732 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:26:21 -0400 (0:00:00.414) 0:03:57.147 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:26:21 -0400 (0:00:00.336) 0:03:57.484 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:26:21 -0400 (0:00:00.298) 0:03:57.782 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:26:22 -0400 (0:00:00.348) 0:03:58.131 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:26:22 -0400 (0:00:00.242) 0:03:58.373 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:26:22 -0400 (0:00:00.176) 0:03:58.549 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:26:22 -0400 (0:00:00.248) 0:03:58.797 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:26:22 -0400 (0:00:00.175) 0:03:58.973 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: 
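
The fstab subset below works by counting matches: the facts set in "Set some variables for fstab checking" hold the pieces of /etc/fstab that reference the volume's mount id, mount point, and options, and the subsequent assertions compare each count against its expected value ("1" for a present, mounted volume). A plausible reconstruction of that shape, assuming stdout_lines filtering (the real tasks may extract matches differently, e.g. with regex_findall):

    - name: Count fstab lines that reference the volume
      ansible.builtin.set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', storage_test_volume._mount_id) | list }}"

    - name: Verify that the device identifier appears in /etc/fstab
      ansible.builtin.assert:
        that: storage_test_fstab_id_matches | length == 1
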
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:26:23 -0400 (0:00:00.235) 0:03:59.209 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:26:23 -0400 (0:00:00.661) 0:03:59.870 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:26:24 -0400 (0:00:00.341) 0:04:00.211 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:26:24 -0400 (0:00:00.321) 0:04:00.533 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:26:24 -0400 (0:00:00.211) 0:04:00.745 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:26:25 -0400 (0:00:00.282) 0:04:01.027 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:26:25 -0400 (0:00:00.233) 0:04:01.261 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:26:26 -0400 (0:00:00.889) 0:04:02.150 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: 
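
The device subset below stats the volume's node and asserts on the result; the log shows "exists": true and "isblk": true for /dev/sda, which is exactly what the assertions need. A minimal sketch of that check, using only the documented stat return fields (task names mirror the log; the register name is illustrative):

    - name: See whether the device node is present
      ansible.builtin.stat:
        path: "{{ storage_test_volume._device }}"
      register: storage_test_dev

    - name: Verify the presence/absence of the device node
      ansible.builtin.assert:
        that:
          - storage_test_dev.stat.exists
          - storage_test_dev.stat.isblk
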
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:26:26 -0400 (0:00:00.410) 0:04:02.561 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381959.0396943, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776381959.0396943, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776381959.0396943, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:26:27 -0400 (0:00:01.334) 0:04:03.895 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:26:28 -0400 (0:00:00.239) 0:04:04.134 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:26:28 -0400 (0:00:00.280) 0:04:04.414 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:26:28 -0400 (0:00:00.376) 0:04:04.791 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:26:29 -0400 (0:00:00.269) 0:04:05.061 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:26:29 -0400 (0:00:00.223) 0:04:05.285 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:26:29 
-0400 (0:00:00.251) 0:04:05.537 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:26:29 -0400 (0:00:00.230) 0:04:05.767 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:26:31 -0400 (0:00:01.851) 0:04:07.619 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:26:31 -0400 (0:00:00.133) 0:04:07.752 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:26:31 -0400 (0:00:00.168) 0:04:07.921 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.237) 0:04:08.159 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.147) 0:04:08.307 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.141) 0:04:08.449 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.218) 0:04:08.667 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.192) 0:04:08.859 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:26:32 -0400 (0:00:00.144) 0:04:09.004 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:26:33 -0400 (0:00:00.430) 0:04:09.434 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:26:33 -0400 (0:00:00.224) 0:04:09.658 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:26:33 -0400 (0:00:00.173) 0:04:09.831 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:26:33 -0400 (0:00:00.186) 0:04:10.018 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:26:34 -0400 (0:00:00.261) 0:04:10.279 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:26:34 -0400 (0:00:00.305) 0:04:10.585 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:26:34 -0400 (0:00:00.199) 0:04:10.784 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:26:34 -0400 (0:00:00.203) 0:04:10.987 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:26:35 -0400 (0:00:00.186) 0:04:11.174 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:26:35 -0400 (0:00:00.247) 0:04:11.421 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:26:35 -0400 (0:00:00.180) 0:04:11.602 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:26:35 -0400 (0:00:00.181) 0:04:11.783 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:26:36 -0400 (0:00:00.248) 0:04:12.032 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:26:36 -0400 (0:00:00.132) 0:04:12.165 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] 
*************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:26:36 -0400 (0:00:00.189) 0:04:12.354 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:26:36 -0400 (0:00:00.268) 0:04:12.623 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:26:36 -0400 (0:00:00.165) 0:04:12.789 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:26:37 -0400 (0:00:00.342) 0:04:13.131 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:26:37 -0400 (0:00:00.219) 0:04:13.351 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:26:37 -0400 (0:00:00.302) 0:04:13.653 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:26:37 -0400 (0:00:00.232) 0:04:13.886 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:26:38 -0400 (0:00:00.211) 0:04:14.097 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:26:38 -0400 (0:00:00.254) 
0:04:14.352 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:26:38 -0400 (0:00:00.265) 0:04:14.618 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:26:38 -0400 (0:00:00.254) 0:04:14.873 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:26:39 -0400 (0:00:00.247) 0:04:15.120 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:26:39 -0400 (0:00:00.294) 0:04:15.415 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:26:39 -0400 (0:00:00.296) 0:04:15.711 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:26:40 -0400 (0:00:00.325) 0:04:16.037 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:26:40 -0400 (0:00:00.353) 0:04:16.391 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:26:40 -0400 (0:00:00.273) 0:04:16.664 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": 
"Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:26:40 -0400 (0:00:00.323) 0:04:16.987 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:26:41 -0400 (0:00:00.421) 0:04:17.409 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:26:41 -0400 (0:00:00.321) 0:04:17.731 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:26:42 -0400 (0:00:00.323) 0:04:18.055 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:26:42 -0400 (0:00:00.265) 0:04:18.320 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:26:42 -0400 (0:00:00.342) 0:04:18.663 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:26:42 -0400 (0:00:00.263) 0:04:18.926 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:26:43 -0400 (0:00:00.288) 0:04:19.215 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 
TASK [Show actual size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Thursday 16 April 2026 19:26:43 -0400 (0:00:00.254) 0:04:19.469 ********
ok: [managed-node13] => {
    "storage_test_actual_size": {
        "changed": false,
        "false_condition": "storage_test_volume.type not in ['partition', 'disk']",
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Thursday 16 April 2026 19:26:43 -0400 (0:00:00.264) 0:04:19.733 ********
ok: [managed-node13] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Thursday 16 April 2026 19:26:43 -0400 (0:00:00.283) 0:04:20.017 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == \"lvm\"",
    "skip_reason": "Conditional result was False"
}
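The "VARIABLE IS NOT DEFINED!" line above is harmless here: ansible.builtin.debug with var: prints that marker instead of failing when the variable is unset, and the assert that would consume the value is itself skipped for non-LVM volumes. Where such a debug should stay quiet for a legitimately unset variable, a default() filter does it; a small sketch reusing the task name from the log:

- name: Show expected size - 2
  ansible.builtin.debug:
    msg: "{{ storage_test_expected_size | d('not set') }}"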
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Thursday 16 April 2026 19:26:44 -0400 (0:00:00.212) 0:04:20.230 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Thursday 16 April 2026 19:26:44 -0400 (0:00:00.180) 0:04:20.411 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Thursday 16 April 2026 19:26:44 -0400 (0:00:00.227) 0:04:20.638 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Thursday 16 April 2026 19:26:44 -0400 (0:00:00.185) 0:04:20.824 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Thursday 16 April 2026 19:26:45 -0400 (0:00:00.238) 0:04:21.062 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Thursday 16 April 2026 19:26:45 -0400 (0:00:00.264) 0:04:21.327 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Thursday 16 April 2026 19:26:45 -0400 (0:00:00.183) 0:04:21.511 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Thursday 16 April 2026 19:26:45 -0400 (0:00:00.144) 0:04:21.655 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Thursday 16 April 2026 19:26:45 -0400 (0:00:00.244) 0:04:21.899 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Create a file] ***********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Thursday 16 April 2026 19:26:46 -0400 (0:00:00.279) 0:04:22.179 ********
changed: [managed-node13] => {
    "changed": true,
    "dest": "/opt/test1/quux",
    "gid": 0,
    "group": "root",
    "mode": "0644",
    "owner": "root",
    "secontext": "unconfined_u:object_r:unlabeled_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}

TASK [Test for correct handling of safe_mode - 2] ******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:154
Thursday 16 April 2026 19:26:47 -0400 (0:00:01.300) 0:04:23.479 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Thursday 16 April 2026 19:26:48 -0400 (0:00:00.839) 0:04:24.319 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "storage_pools_global": [],
        "storage_safe_mode_global": true,
        "storage_volumes_global": []
    },
    "changed": false
}

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Thursday 16 April 2026 19:26:48 -0400 (0:00:00.383) 0:04:24.702 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
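From here the test deliberately provokes a failure: a file now exists on the mounted filesystem, storage_safe_mode_global is true, and the role is about to be asked to add encryption to the same disk. The real verify-role-failed.yml body is not shown in this log, so the following is only an assumed sketch of the failure-verification pattern it implements:

# Assumed shape: run the role, expect it to fail, compare the error text
# captured in the rescue section.
- name: Verify role raises correct error
  block:
    - name: Run the role and expect failure
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
    - name: Unreachable on success
      ansible.builtin.fail:
        msg: role was expected to fail but did not
  rescue:
    - name: Check returned error
      ansible.builtin.assert:
        that: ansible_failed_result.msg is search(__storage_failed_regex)
      vars:
        # regex taken from the MSG reported later in this run
        __storage_failed_regex: "cannot remove existing formatting.*in safe mode"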
TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:26:49 -0400 (0:00:00.374) 0:04:25.077 ********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Thursday 16 April 2026 19:26:49 -0400 (0:00:00.015) 0:04:25.092 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "__sr_failed_when is defined",
    "skip_reason": "Conditional result was False"
}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Thursday 16 April 2026 19:26:49 -0400 (0:00:00.243) 0:04:25.336 ********
included: fedora.linux_system_roles.storage for managed-node13

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 16 April 2026 19:26:49 -0400 (0:00:00.341) 0:04:25.677 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python3.9"
    },
    "changed": false,
    "message": "Message written to syslog"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7
Thursday 16 April 2026 19:26:51 -0400 (0:00:01.445) 0:04:27.122 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 16 April 2026 19:26:51 -0400 (0:00:00.288) 0:04:27.410 ********
ok: [managed-node13]
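The "META: facts cleared" line is the test's run_role_with_clear_facts.yml helper wiping the fact cache before each role run, which is why interpreter discovery repeats above. Reconstructed from the task names and the __sr_failed_when condition in the log, the helper plausibly looks like this (assumed, not verbatim):

- name: Clear facts
  ansible.builtin.meta: clear_facts

# Variant used when a test supplies a custom failure condition.
- name: Run the role
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  when: __sr_failed_when is defined

- name: Run the role normally
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  when: __sr_failed_when is not defined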
"libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:26:54 -0400 (0:00:00.796) 0:04:30.105 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:26:54 -0400 (0:00:00.235) 0:04:30.341 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:26:54 -0400 (0:00:00.272) 0:04:30.613 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:26:54 -0400 (0:00:00.156) 0:04:30.770 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:26:54 -0400 (0:00:00.197) 0:04:30.967 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:26:55 -0400 (0:00:00.627) 0:04:31.595 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:26:55 -0400 (0:00:00.276) 0:04:31.913 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Thursday 16 April 2026 19:26:56 -0400 (0:00:00.321) 0:04:32.235 ********
ok: [managed-node13] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Thursday 16 April 2026 19:26:58 -0400 (0:00:02.231) 0:04:34.466 ********
ok: [managed-node13] => {
    "storage_pools | d([])": []
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Thursday 16 April 2026 19:26:58 -0400 (0:00:00.292) 0:04:34.759 ********
ok: [managed-node13] => {
    "storage_volumes | d([])": [
        {
            "disks": [
                "sda"
            ],
            "encryption": true,
            "encryption_password": "yabbadabbadoo",
            "mount_point": "/opt/test1",
            "name": "foo",
            "type": "disk"
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Thursday 16 April 2026 19:26:59 -0400 (0:00:00.290) 0:04:35.049 ********
ok: [managed-node13] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [
        "cryptsetup"
    ],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Thursday 16 April 2026 19:27:01 -0400 (0:00:02.133) 0:04:37.183 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Thursday 16 April 2026 19:27:01 -0400 (0:00:00.307) 0:04:37.491 ********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Thursday 16 April 2026 19:27:01 -0400 (0:00:00.182) 0:04:37.674 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "install_copr | d(false) | bool",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Thursday 16 April 2026 19:27:01 -0400 (0:00:00.151) 0:04:37.825 ********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Thursday 16 April 2026 19:27:01 -0400 (0:00:00.135) 0:04:37.960 ********
ok: [managed-node13] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
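The storage_volumes dump above is the entire input for this failure case; reconstructed as a role invocation it amounts to the following (the values come straight from the log, the wrapper task is assumed):

- name: Attempt to add encryption to a disk that already has a filesystem
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_safe_mode: true          # stored earlier as storage_safe_mode_global
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo

Because sda currently holds the unencrypted filesystem carrying /opt/test1/quux, honoring this spec means reformatting the disk, which is exactly what safe mode exists to block; note that cryptsetup is the only extra package the role computes it would need.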
TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Thursday 16 April 2026 19:27:03 -0400 (0:00:01.732) 0:04:39.693 ********
ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" },
"dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": 
"running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service": { "name": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": 
"user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:27:06 -0400 (0:00:02.955) 0:04:42.649 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d081a4f92\x2d2987\x2d47ea\x2db6a5\x2d2bc265a88537.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "name": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" systemd-udevd-kernel.socket dev-sda.device", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d081a4f92\\\\x2d2987\\\\x2d47ea\\\\x2db6a5\\\\x2d2bc265a88537.target\" umount.target cryptsetup.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": 
"0", "Delegate": "no", "Description": "Cryptography Setup for luks-081a4f92-2987-47ea-b6a5-2bc265a88537", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-081a4f92-2987-47ea-b6a5-2bc265a88537 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-081a4f92-2987-47ea-b6a5-2bc265a88537 /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-081a4f92-2987-47ea-b6a5-2bc265a88537 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-081a4f92-2987-47ea-b6a5-2bc265a88537 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", 
"ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d081a4f92\\\\x2d2987\\\\x2d47ea\\\\x2db6a5\\\\x2d2bc265a88537.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:26:08 EDT", "StateChangeTimestampMonotonic": "1779031492", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d081a4f92\\\\x2d2987\\\\x2d47ea\\\\x2db6a5\\\\x2d2bc265a88537.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: 
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88
Thursday 16 April 2026 19:27:08 -0400 (0:00:01.789) 0:04:44.438 ********
fatal: [managed-node13]: FAILED! => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

MSG:

cannot remove existing formatting on device 'sda' in safe mode due to adding encryption

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129
Thursday 16 April 2026 19:27:10 -0400 (0:00:02.087) 0:04:46.526 ********
fatal: [managed-node13]: FAILED! => {
    "changed": false
}

MSG:

{'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [], 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
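This is the failure the test was fishing for: with 'safe_mode': True in the module args, blivet refuses to destroy the existing formatting on sda as a side effect of adding encryption, and the role surfaces that through its "Failed message" task before cleaning up. Outside of a test, the way to let the role perform such a destructive change is to switch safe mode off explicitly via the role's storage_safe_mode variable; a sketch with the same volume spec as above:

- name: Re-run with safe mode disabled (destroys the existing filesystem)
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_safe_mode: false
    storage_volumes:
      - name: foo
        type: disk
        disks:
          - sda
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo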
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:27:10 -0400 (0:00:00.198) 0:04:46.724 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d081a4f92\x2d2987\x2d47ea\x2db6a5\x2d2bc265a88537.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "name": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", 
"LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d081a4f92\\x2d2987\\x2d47ea\\x2db6a5\\x2d2bc265a88537.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d081a4f92\\\\x2d2987\\\\x2d47ea\\\\x2db6a5\\\\x2d2bc265a88537.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:27:12 -0400 (0:00:01.602) 0:04:48.327 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:27:12 -0400 (0:00:00.258) 0:04:48.585 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:27:12 -0400 (0:00:00.397) 0:04:48.983 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:27:13 -0400 (0:00:00.299) 0:04:49.282 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382007.2847064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382007.2847064, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776382007.2847064, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2192063821", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:27:14 -0400 (0:00:01.135) 0:04:50.418 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:174 Thursday 16 April 2026 19:27:14 -0400 (0:00:00.262) 0:04:50.681 ******** included: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:27:15 -0400 (0:00:00.592) 0:04:51.273 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:27:15 -0400 (0:00:00.016) 0:04:51.290 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:27:15 -0400 (0:00:00.218) 0:04:51.509 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:27:15 -0400 (0:00:00.382) 0:04:51.891 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:27:17 -0400 (0:00:01.266) 0:04:53.157 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:27:17 -0400 (0:00:00.185) 0:04:53.343 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:27:18 -0400 (0:00:01.666) 0:04:55.009 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:27:19 -0400 (0:00:00.780) 0:04:55.790 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:27:20 -0400 (0:00:00.277) 0:04:56.067 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:27:20 -0400 (0:00:00.253) 0:04:56.321 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:27:20 -0400 (0:00:00.238) 0:04:56.559 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:27:20 -0400 (0:00:00.294) 0:04:56.854 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:27:21 -0400 (0:00:00.786) 0:04:57.641 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:27:21 -0400 (0:00:00.305) 0:04:57.946 ******** skipping: [managed-node13] => { "changed": false, 
"false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:27:22 -0400 (0:00:00.301) 0:04:58.248 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:27:24 -0400 (0:00:01.962) 0:05:00.210 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:27:24 -0400 (0:00:00.131) 0:05:00.341 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:27:24 -0400 (0:00:00.389) 0:05:00.731 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:27:26 -0400 (0:00:02.121) 0:05:02.853 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:27:27 -0400 (0:00:00.453) 0:05:03.307 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:27:27 -0400 (0:00:00.214) 0:05:03.521 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:27:27 -0400 (0:00:00.249) 0:05:03.770 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:27:27 -0400 (0:00:00.199) 0:05:03.970 
******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:27:29 -0400 (0:00:02.006) 0:05:05.976 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": 
"inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": 
"gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": 
"systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { 
"name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": 
"sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": 
"systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:27:33 -0400 (0:00:03.874) 0:05:09.851 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:27:34 -0400 (0:00:00.451) 0:05:10.302 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:27:45 -0400 (0:00:11.689) 0:05:21.991 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:27:46 -0400 (0:00:00.191) 0:05:22.183 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381966.7286963, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "36b1eb8dd554754c6fe516edce5371e95ddc4e15", "ctime": 1776381966.7246962, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776381966.7246962, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:27:47 -0400 (0:00:01.332) 0:05:23.521 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:27:48 -0400 (0:00:01.293) 0:05:24.815 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:27:48 -0400 (0:00:00.086) 0:05:24.902 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", 
"/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:27:49 -0400 (0:00:00.324) 0:05:25.226 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:27:49 -0400 (0:00:00.217) 0:05:25.444 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:27:49 -0400 (0:00:00.215) 0:05:25.660 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': 'UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=858846c8-d74e-4cd0-9e12-41cb8c400d2b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:27:50 -0400 (0:00:01.349) 0:05:27.009 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:27:52 -0400 (0:00:01.690) 0:05:28.700 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:27:54 -0400 (0:00:01.332) 0:05:30.032 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:27:54 -0400 (0:00:00.417) 0:05:30.450 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:27:56 -0400 (0:00:01.654) 0:05:32.105 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776381977.130699, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776381971.0566974, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 645923037, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776381971.056634, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "497359111", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:27:57 -0400 (0:00:01.319) 0:05:33.424 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda', 'name': 'luks-0ac178a1-61aa-4226-b754-53f41de902ef', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:27:58 -0400 (0:00:01.346) 0:05:34.771 ******** ok: [managed-node13] TASK [Verify role results - 3] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:186 Thursday 16 April 2026 19:28:00 -0400 (0:00:01.953) 0:05:36.725 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:28:01 -0400 (0:00:00.781) 0:05:37.507 ******** skipping: [managed-node13] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:28:01 -0400 (0:00:00.225) 0:05:37.733 ******** ok: [managed-node13] => { "_storage_volumes_list": [ { 
"_device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:28:02 -0400 (0:00:00.335) 0:05:38.069 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "size": "10G", "type": "crypt", "uuid": "a8d2dc65-a4b1-44d7-94cb-2c0e433c042f" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0ac178a1-61aa-4226-b754-53f41de902ef" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:28:03 -0400 (0:00:01.408) 0:05:39.477 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", 
"/etc/fstab" ], "delta": "0:00:00.003076", "end": "2026-04-16 19:28:04.584694", "rc": 0, "start": "2026-04-16 19:28:04.581618" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:28:04 -0400 (0:00:01.346) 0:05:40.824 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002961", "end": "2026-04-16 19:28:05.669964", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:28:05.667003" } STDOUT: luks-0ac178a1-61aa-4226-b754-53f41de902ef /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:28:05 -0400 (0:00:01.002) 0:05:41.826 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:28:05 -0400 (0:00:00.148) 0:05:41.974 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'present', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 
'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef', '_raw_device': '/dev/sda', '_mount_id': '/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:28:06 -0400 (0:00:00.549) 0:05:42.524 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:28:06 -0400 (0:00:00.339) 0:05:42.863 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:28:08 -0400 (0:00:01.979) 0:05:44.843 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:28:09 -0400 (0:00:00.379) 0:05:45.223 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:28:09 -0400 (0:00:00.453) 0:05:45.677 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:28:10 -0400 (0:00:00.509) 0:05:46.187 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:28:10 -0400 (0:00:00.388) 0:05:46.575 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:28:10 -0400 (0:00:00.338) 0:05:46.914 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:28:11 -0400 (0:00:00.409) 0:05:47.324 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:28:11 -0400 (0:00:00.424) 0:05:47.749 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:28:11 -0400 (0:00:00.219) 0:05:47.968 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:28:12 -0400 (0:00:00.167) 0:05:48.136 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": 
"Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:28:12 -0400 (0:00:00.146) 0:05:48.283 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:28:12 -0400 (0:00:00.209) 0:05:48.493 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:28:13 -0400 (0:00:00.591) 0:05:49.084 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:28:13 -0400 (0:00:00.271) 0:05:49.356 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:28:13 -0400 (0:00:00.427) 0:05:49.784 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:28:14 -0400 (0:00:00.243) 0:05:50.027 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:28:14 -0400 (0:00:00.322) 0:05:50.350 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:28:14 -0400 (0:00:00.249) 0:05:50.599 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:28:15 -0400 (0:00:00.460) 0:05:51.060 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:28:15 -0400 (0:00:00.382) 0:05:51.442 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382065.4327207, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382065.4327207, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776382065.4327207, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:28:16 -0400 (0:00:01.123) 0:05:52.566 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:28:16 -0400 (0:00:00.284) 0:05:52.850 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:28:17 -0400 (0:00:00.203) 0:05:53.053 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:28:17 -0400 (0:00:00.314) 0:05:53.368 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:28:17 -0400 (0:00:00.290) 0:05:53.658 ******** skipping: [managed-node13] => { "changed": false, 
"false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:28:17 -0400 (0:00:00.168) 0:05:53.827 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:28:18 -0400 (0:00:00.326) 0:05:54.153 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382065.6757207, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382065.6757207, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1126, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382065.6757207, "nlink": 1, "path": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:28:19 -0400 (0:00:01.196) 0:05:55.349 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:28:21 -0400 (0:00:01.711) 0:05:57.061 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.007274", "end": "2026-04-16 19:28:22.109476", "rc": 0, "start": "2026-04-16 19:28:22.102202" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 0ac178a1-61aa-4226-b754-53f41de902ef Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 680793 Threads: 2 Salt: 16 a9 0e f1 02 58 aa 28 8b 21 73 d0 f5 98 15 bd 19 f7 0e 73 bb 44 9c e5 ad fa ce 17 50 49 36 73 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131995 Salt: 02 f3 e8 8e 86 d5 bf c8 0a ca d3 7b 01 bd e9 f2 b1 75 3b b0 68 f3 0c 41 fa b5 1a 6d 4b 32 2c 9b Digest: 88 6f fa 06 02 e1 ce 92 b3 49 a2 34 63 e7 1f 33 71 31 d9 a8 03 62 b5 c6 10 69 0c 5d a6 5f 07 81 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:28:22 -0400 (0:00:01.249) 0:05:58.311 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:28:22 -0400 (0:00:00.329) 0:05:58.640 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:28:22 -0400 (0:00:00.290) 0:05:58.931 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:28:23 -0400 (0:00:00.256) 0:05:59.187 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:28:23 -0400 (0:00:00.229) 0:05:59.417 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:28:23 -0400 (0:00:00.276) 0:05:59.693 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:28:23 -0400 (0:00:00.195) 0:05:59.889 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:28:24 -0400 (0:00:00.291) 0:06:00.181 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-0ac178a1-61aa-4226-b754-53f41de902ef /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:28:24 -0400 (0:00:00.405) 0:06:00.586 ******** ok: [managed-node13] => { "changed": 
false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:28:24 -0400 (0:00:00.346) 0:06:00.933 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:28:25 -0400 (0:00:00.297) 0:06:01.230 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:28:25 -0400 (0:00:00.325) 0:06:01.568 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:28:25 -0400 (0:00:00.292) 0:06:01.860 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:28:26 -0400 (0:00:00.220) 0:06:02.081 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:28:26 -0400 (0:00:00.169) 0:06:02.250 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:28:26 -0400 (0:00:00.184) 0:06:02.434 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:28:26 -0400 (0:00:00.193) 0:06:02.628 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:28:26 -0400 
(0:00:00.186) 0:06:02.814 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:28:26 -0400 (0:00:00.148) 0:06:02.964 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:28:27 -0400 (0:00:00.151) 0:06:03.116 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:28:27 -0400 (0:00:00.129) 0:06:03.245 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:28:27 -0400 (0:00:00.233) 0:06:03.479 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:28:27 -0400 (0:00:00.171) 0:06:03.650 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:28:27 -0400 (0:00:00.154) 0:06:03.829 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:28:28 -0400 (0:00:00.226) 0:06:04.056 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:28:28 -0400 (0:00:00.297) 0:06:04.353 ******** skipping: [managed-node13] => { "changed": false, 
"false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:28:28 -0400 (0:00:00.258) 0:06:04.611 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:28:28 -0400 (0:00:00.214) 0:06:04.826 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:28:29 -0400 (0:00:00.241) 0:06:05.067 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:28:29 -0400 (0:00:00.228) 0:06:05.296 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:28:29 -0400 (0:00:00.204) 0:06:05.500 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:28:29 -0400 (0:00:00.228) 0:06:05.728 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:28:29 -0400 (0:00:00.180) 0:06:05.909 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:28:30 -0400 (0:00:00.314) 0:06:06.223 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 
Thursday 16 April 2026 19:28:30 -0400 (0:00:00.199) 0:06:06.423 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:28:30 -0400 (0:00:00.207) 0:06:06.631 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:28:30 -0400 (0:00:00.313) 0:06:06.945 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:28:31 -0400 (0:00:00.220) 0:06:07.165 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:28:31 -0400 (0:00:00.241) 0:06:07.406 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:28:31 -0400 (0:00:00.292) 0:06:07.699 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:28:31 -0400 (0:00:00.250) 0:06:07.949 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:28:32 -0400 (0:00:00.254) 0:06:08.204 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:28:32 -0400 (0:00:00.253) 0:06:08.458 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on 
pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:28:32 -0400 (0:00:00.313) 0:06:08.772 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:28:33 -0400 (0:00:00.287) 0:06:09.060 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:28:33 -0400 (0:00:00.397) 0:06:09.457 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:28:33 -0400 (0:00:00.313) 0:06:09.771 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:28:34 -0400 (0:00:00.257) 0:06:10.028 ******** ok: [managed-node13] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:28:34 -0400 (0:00:00.259) 0:06:10.288 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:28:34 -0400 (0:00:00.239) 0:06:10.528 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:28:34 -0400 (0:00:00.247) 0:06:10.776 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] 
***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:28:34 -0400 (0:00:00.158) 0:06:10.934 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:28:35 -0400 (0:00:00.220) 0:06:11.154 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:28:35 -0400 (0:00:00.223) 0:06:11.378 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:28:35 -0400 (0:00:00.134) 0:06:11.513 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:28:35 -0400 (0:00:00.203) 0:06:11.716 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:28:35 -0400 (0:00:00.239) 0:06:11.956 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:28:36 -0400 (0:00:00.262) 0:06:12.218 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:28:36 -0400 (0:00:00.790) 0:06:13.008 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, 
"changed": false } TASK [Test for correct handling of new encrypted volume w/ no key - 2] ********* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:193 Thursday 16 April 2026 19:28:37 -0400 (0:00:00.182) 0:06:13.191 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Thursday 16 April 2026 19:28:37 -0400 (0:00:00.695) 0:06:13.886 ******** ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Thursday 16 April 2026 19:28:38 -0400 (0:00:00.284) 0:06:14.170 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:28:38 -0400 (0:00:00.231) 0:06:14.402 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:28:38 -0400 (0:00:00.021) 0:06:14.424 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:28:38 -0400 (0:00:00.296) 0:06:14.720 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:28:39 -0400 (0:00:00.438) 0:06:15.158 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:28:40 -0400 (0:00:01.308) 0:06:16.467 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:28:40 -0400 (0:00:00.191) 0:06:16.659 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** 
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:28:42 -0400 (0:00:01.624) 0:06:18.284 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:28:42 -0400 (0:00:00.507) 0:06:18.791 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:28:43 -0400 (0:00:00.283) 0:06:19.075 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:28:43 -0400 (0:00:00.267) 0:06:19.343 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:28:43 -0400 (0:00:00.167) 0:06:19.511 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
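The "Set platform/version specific variables" task above is the standard system-roles pattern for layering distro-specific vars: it walks candidate filenames from most generic to most specific and includes only the ones that actually exist (the "__vars_file is file" condition). CentOS_9.yml is included twice here, most likely because two candidates in the list resolve to the same filename on this host. A sketch of the pattern; the exact candidate list is an assumption, the role's real list may differ:

    - name: Set platform/version specific variables
      ansible.builtin.include_vars: "{{ __vars_file }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"        # RedHat.yml, skipped: no such vars file
        - "{{ ansible_facts['distribution'] }}.yml"     # CentOS.yml, skipped likewise
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file

On CentOS Stream 9 the distribution_version fact reports plain "9", so the last two candidates both become CentOS_9.yml, which matches the duplicated ok record above.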
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:28:43 -0400 (0:00:00.248) 0:06:19.759 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:28:44 -0400 (0:00:00.681) 0:06:20.441 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:28:44 -0400 (0:00:00.279) 0:06:20.720 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:28:44 -0400 (0:00:00.302) 0:06:21.023 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:28:46 -0400 (0:00:01.936) 0:06:22.960 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:28:47 -0400 (0:00:00.294) 0:06:23.254 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:28:47 -0400 (0:00:00.326) 0:06:23.581 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:28:49 -0400 (0:00:02.125) 0:06:25.706 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:28:50 -0400 (0:00:00.441) 0:06:26.147 ******** skipping: [managed-node13] =>
{ "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:28:50 -0400 (0:00:00.164) 0:06:26.312 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:28:50 -0400 (0:00:00.162) 0:06:26.474 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:28:50 -0400 (0:00:00.196) 0:06:26.671 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:28:52 -0400 (0:00:01.825) 0:06:28.496 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": 
"stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:28:55 -0400 (0:00:02.650) 0:06:31.147 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:28:55 -0400 (0:00:00.356) 0:06:31.503 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:28:57 -0400 (0:00:02.040) 0:06:33.544 ******** fatal: [managed-node13]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:28:57 -0400 (0:00:00.325) 0:06:33.870 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] 
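This failure is the expected outcome: the pool requests "encryption": true for volume test1 (see the Show storage_pools output earlier) but supplies neither encryption_password nor encryption_key, so the blivet module refuses with "encrypted volume 'test1' missing key/password". The verification tasks that follow assert on the captured result. Expressed as plain Ansible, one way to make the same check is a block/rescue around the role, using the built-in ansible_failed_result variable; this is a sketch, not the harness's actual implementation:

    - name: Verify role raises correct error
      block:
        - name: Run the role, expecting it to fail
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
        - name: Flag an unexpected success
          ansible.builtin.fail:
            msg: role succeeded but was expected to fail
      rescue:
        - name: Check that we failed in the role
          ansible.builtin.assert:
            that:
              # ansible_failed_result holds the failed task's result in rescue
              - "'missing key/password' in ansible_failed_result.msg | d('')"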
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:28:58 -0400 (0:00:00.413) 0:06:34.283 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:28:58 -0400 (0:00:00.199) 0:06:34.482 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:28:58 -0400 (0:00:00.339) 0:06:34.821 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" }
TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:212 Thursday 16 April 2026 19:28:59 -0400 (0:00:00.212) 0:06:35.033 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:28:59 -0400 (0:00:00.526) 0:06:35.560 ******** META: facts cleared
TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:28:59 -0400 (0:00:00.013) 0:06:35.574 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }
TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:28:59 -0400 (0:00:00.129) 0:06:35.703 ******** included: fedora.linux_system_roles.storage for managed-node13
TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:28:59 -0400 (0:00:00.286) 0:06:35.989 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:29:01 -0400 (0:00:01.228) 0:06:37.218 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13
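From here the test repeats the run with the missing piece supplied: the "Create an encrypted partition volume w/ default fs" rerun above uses the same pool definition plus an encryption_password, as the Show storage_pools output further below confirms. The corrected variables, sketched from the values this log shows (yabbadabbadoo is the test suite's throwaway passphrase; a real playbook would keep it in Ansible Vault rather than in plain text):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            encryption_password: yabbadabbadoo  # test-only value taken from this log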
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:29:01 -0400 (0:00:00.274) 0:06:37.493 ******** ok: [managed-node13]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:29:03 -0400 (0:00:01.713) 0:06:39.207 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:29:04 -0400 (0:00:00.951) 0:06:40.158 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:29:04 -0400 (0:00:00.303) 0:06:40.462 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:29:04 -0400 (0:00:00.235) 0:06:40.698 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:29:04 -0400 (0:00:00.213) 0:06:40.911 ******** ok:
[managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:29:05 -0400 (0:00:00.214) 0:06:41.126 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:29:05 -0400 (0:00:00.770) 0:06:41.897 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:29:06 -0400 (0:00:00.161) 0:06:42.058 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:29:06 -0400 (0:00:00.182) 0:06:42.241 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:29:07 -0400 (0:00:01.755) 0:06:43.997 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:29:08 -0400 (0:00:00.331) 0:06:44.328 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:29:08 -0400 (0:00:00.287) 0:06:44.615 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:29:10 -0400 (0:00:02.024) 0:06:46.639 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for 
managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:29:11 -0400 (0:00:00.472) 0:06:47.112 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:29:11 -0400 (0:00:00.194) 0:06:47.306 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:29:11 -0400 (0:00:00.225) 0:06:47.532 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:29:11 -0400 (0:00:00.187) 0:06:47.719 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:29:13 -0400 (0:00:01.943) 0:06:49.663 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": 
"cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": 
"systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:29:16 -0400 (0:00:02.676) 0:06:52.339 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:29:16 -0400 (0:00:00.407) 0:06:52.747 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { 
"action": "create device", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:29:28 -0400 (0:00:12.169) 0:07:04.917 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:29:29 -0400 (0:00:00.187) 0:07:05.104 ******** ok: 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103
Thursday 16 April 2026 19:29:28 -0400 (0:00:12.169) 0:07:04.917 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110
Thursday 16 April 2026 19:29:29 -0400 (0:00:00.187) 0:07:05.104 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382073.7977228, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e5b31e7c1a2d9dd24586e8357168c69b10913696", "ctime": 1776382073.7937226, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382073.7937226, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115
Thursday 16 April 2026 19:29:30 -0400 (0:00:01.229) 0:07:06.333 ********
ok: [managed-node13] => { "backup": "", "changed": false }
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133
Thursday 16 April 2026 19:29:31 -0400 (0:00:01.298) 0:07:07.632 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Show blivet_output] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139
Thursday 16 April 2026 19:29:32 -0400 (0:00:00.580) 0:07:08.213 ********
ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs" ], "pools": [ {
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:29:32 -0400 (0:00:00.340) 0:07:08.553 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:29:32 -0400 (0:00:00.304) 0:07:08.858 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:29:33 -0400 (0:00:00.247) 0:07:09.105 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-0ac178a1-61aa-4226-b754-53f41de902ef" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:29:34 -0400 (0:00:01.588) 0:07:10.694 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:29:36 -0400 (0:00:01.763) 0:07:12.457 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:29:37 -0400 (0:00:01.446) 0:07:13.904 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'path': '/opt/test1', 
TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195
Thursday 16 April 2026 19:29:37 -0400 (0:00:01.446) 0:07:13.904 ********
skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "mounted" }, "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => { "changed": false }
MSG: All items skipped
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207
Thursday 16 April 2026 19:29:38 -0400 (0:00:00.321) 0:07:14.225 ********
ok: [managed-node13] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215
Thursday 16 April 2026 19:29:39 -0400 (0:00:01.607) 0:07:15.833 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382085.6687257, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "db16cb3eb7343f5135279765b44a6e79b0729c09", "ctime": 1776382078.6127238, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104010, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776382078.6135828, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1123077803", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220
Thursday 16 April 2026 19:29:41 -0400 (0:00:01.232) 0:07:17.066 ********
changed: [managed-node13] => (item={'backing_device': '/dev/sda', 'name': 'luks-0ac178a1-61aa-4226-b754-53f41de902ef', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-0ac178a1-61aa-4226-b754-53f41de902ef", "password": "-", "state": "absent" }, "found": 1 }
MSG: 1 line(s) removed
changed: [managed-node13] => (item={'backing_device': '/dev/sda1', 'name': 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "present" } }
MSG: line added
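Each entry in blivet's "crypts" list becomes one /etc/crypttab edit: the stale luks-0ac178a1… line is removed and a luks-2faf63cf… line pointing at /dev/sda1 is added, with '-' in the password field (no key file; the passphrase is supplied externally). A comparable standalone edit, assuming the community.general.crypttab module is available and accepts these fields:

---
# Hedged sketch of equivalent /etc/crypttab edits; the role drives the
# same add/remove from blivet's "crypts" list, not from literal values.
- hosts: managed-node13
  tasks:
    - name: Drop the entry for the destroyed LUKS device
      community.general.crypttab:
        name: luks-0ac178a1-61aa-4226-b754-53f41de902ef
        state: absent

    - name: Add the entry for the new LUKS partition (no key file)
      community.general.crypttab:
        name: luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02
        backing_device: /dev/sda1
        password: "-"
        state: present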
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Thursday 16 April 2026 19:29:43 -0400 (0:00:02.738) 0:07:19.804 ********
ok: [managed-node13]
TASK [Verify role results - 4] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:228
Thursday 16 April 2026 19:29:45 -0400 (0:00:02.136) 0:07:21.941 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13
TASK [Print out pool information] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Thursday 16 April 2026 19:29:46 -0400 (0:00:01.078) 0:07:23.019 ********
ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Thursday 16 April 2026 19:29:47 -0400 (0:00:00.449) 0:07:23.469 ********
skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" }
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Thursday 16 April 2026 19:29:47 -0400 (0:00:00.240) 0:07:23.709 ********
ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "size": "4G", "type": "crypt", "uuid": "1fdcaf56-5d39-4376-b16e-a21d639dab11" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } }
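The info map above is the per-device view (name, fstype, mountpoint, size, type, uuid) that the verification steps compare against the declared pools. Outside the test suite, roughly the same data can be pulled with lsblk; the column set below is an assumption about how to approximate the test's helper, not its actual implementation:

---
# Rough manual approximation of the "Collect info" task; the test suite
# uses its own helper module, so this lsblk invocation is illustrative.
- hosts: managed-node13
  tasks:
    - name: List block devices with the fields shown in the log
      ansible.builtin.command: lsblk --pairs -o NAME,FSTYPE,MOUNTPOINT,SIZE,TYPE,UUID
      register: blockdev_info
      changed_when: false

    - name: Show the collected device data
      ansible.builtin.debug:
        var: blockdev_info.stdout_lines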
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Thursday 16 April 2026 19:29:48 -0400 (0:00:01.275) 0:07:24.984 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003042", "end": "2026-04-16 19:29:49.901568", "rc": 0, "start": "2026-04-16 19:29:49.898526" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Tue Apr 14 06:59:53 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Thursday 16 April 2026 19:29:50 -0400 (0:00:01.096) 0:07:26.081 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003104", "end": "2026-04-16 19:29:51.180337", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:29:51.177233" }
STDOUT:
luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Thursday 16 April 2026 19:29:51 -0400 (0:00:01.311) 0:07:27.392 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None,
'_device': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Thursday 16 April 2026 19:29:51 -0400 (0:00:00.555) 0:07:27.948 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Thursday 16 April 2026 19:29:52 -0400 (0:00:00.219) 0:07:28.168 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Thursday 16 April 2026 19:29:52 -0400 (0:00:00.193) 0:07:28.361 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Thursday 16 April 2026 19:29:52 -0400 (0:00:00.242) 0:07:28.604 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Thursday 16 April 2026 19:29:53 -0400 (0:00:00.618) 0:07:29.222 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Thursday 16 April 2026 19:29:53 -0400 (0:00:00.169) 0:07:29.391 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:29:53 -0400 (0:00:00.297) 0:07:29.688 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Thursday 16 April 2026 19:29:53 -0400 (0:00:00.244) 0:07:29.932 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Verify PV count] *********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.235) 0:07:30.168 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Set expected pv type] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.174) 0:07:30.343 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Set expected pv type - 2] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.168) 0:07:30.511 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" }

TASK [Set expected pv type - 3] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.105) 0:07:30.617 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Check the type of each PV] ***********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.182) 0:07:30.800 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Check that blivet supports PV grow to fill] ******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68
Thursday 16 April 2026 19:29:54 -0400 (0:00:00.146) 0:07:30.946 ********
ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 }

STDOUT:

** (process:86758): WARNING **: 19:29:56.032: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory

STDERR:

OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.43.82 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/983a9e969b'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
Shared connection to 10.31.43.82 closed.
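Every skip above records its false_condition, so the when: guard of each skipped task can be read directly from the log. The one task that actually ran, the blivet grow-to-fill probe, exited 0 with changed and failed_when_result both false; importing blivet is also what makes libblockdev emit the nvme plugin warning seen in STDOUT. A minimal sketch of that probe pattern, with the script body and register name assumed (the real script is not shown in the log):

- name: Check that blivet supports PV grow to fill
  # Hypothetical probe body; the actual script contents are not visible in the log.
  command: python3 -c 'import blivet; print("ok")'
  register: storage_test_grow_probe   # register name assumed
  changed_when: false
  failed_when: storage_test_grow_probe.rc not in [0, 1]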
none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:29:58 -0400 (0:00:00.254) 0:07:34.382 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:29:58 -0400 (0:00:00.231) 0:07:34.614 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:29:58 -0400 (0:00:00.254) 0:07:34.868 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:29:59 -0400 (0:00:00.181) 0:07:35.050 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:29:59 -0400 (0:00:00.185) 0:07:35.235 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:29:59 -0400 (0:00:00.237) 0:07:35.473 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:29:59 -0400 (0:00:00.315) 0:07:35.788 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:30:01 -0400 (0:00:01.502) 0:07:37.291 ******** skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 
'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:30:01 -0400 (0:00:00.318) 0:07:37.610 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:30:02 -0400 (0:00:00.664) 0:07:38.275 ******** skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 
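The per-item skip above exposes both the loop variable (storage_test_lvmraid_volume) and the guard (storage_test_pool.type == 'lvm'); the volume's encryption_password is masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER because the role declares it no_log. A plausible shape for such a guarded per-member loop, with the included file name and loop source assumed rather than taken from the log:

- name: Validate pool member LVM RAID settings
  include_tasks: verify-pool-member-lvmraid.yml    # file name assumed
  loop: "{{ storage_test_pool.volumes | d([]) }}"  # loop source assumed
  loop_control:
    loop_var: storage_test_lvmraid_volume          # loop variable name from the log
  when: storage_test_pool.type == 'lvm'            # guard from the false_condition above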
TASK [Check Thin Pools] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94
Thursday 16 April 2026 19:30:01 -0400 (0:00:00.318) 0:07:37.610 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13

TASK [Validate pool member thinpool settings] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2
Thursday 16 April 2026 19:30:02 -0400 (0:00:00.664) 0:07:38.275 ********
skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } }
skipping: [managed-node13] => { "changed": false }
MSG: All items skipped

TASK [Check member encryption] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Thursday 16 April 2026 19:30:02 -0400 (0:00:00.289) 0:07:38.564 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13

TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Thursday 16 April 2026 19:30:03 -0400 (0:00:00.710) 0:07:39.275 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Thursday 16 April 2026 19:30:03 -0400 (0:00:00.300) 0:07:39.575 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" }

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Thursday 16 April 2026 19:30:03 -0400 (0:00:00.361) 0:07:39.936 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Thursday 16 April 2026 19:30:04 -0400 (0:00:00.168) 0:07:40.105 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Thursday 16 April 2026 19:30:05 -0400 (0:00:00.317) 0:07:41.398 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Thursday 16 April 2026 19:30:06 -0400 (0:00:00.781) 0:07:42.180 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Thursday 16 April 2026 19:30:06 -0400 (0:00:00.164) 0:07:42.345 ******** skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Thursday 16 April 2026 19:30:06 -0400 (0:00:00.261) 0:07:42.606 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Thursday 16 April 2026 19:30:06 -0400 (0:00:00.267) 0:07:42.874 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Thursday 16 April 2026 19:30:07 -0400 (0:00:00.199) 0:07:43.073 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Thursday 16 April 2026 19:30:07 -0400 (0:00:00.179) 0:07:43.252 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Thursday 16 April 2026 19:30:07 -0400 (0:00:00.213) 0:07:43.465 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Thursday 16 April 2026 19:30:07 -0400 (0:00:00.257) 0:07:43.723 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Thursday 16 April 2026 19:30:07 -0400 (0:00:00.279) 0:07:44.003 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:30:08 -0400 (0:00:00.430) 0:07:44.434 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:30:08 -0400 
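The subset list set above drives the fan-out that follows: each entry maps onto a test-verify-volume-<subset>.yml include, matching the eight "included:" lines below. A sketch of that dispatch (the loop shape is inferred; the list contents and file naming appear in the log):

- name: Run test verify for storage_test_volume_subset
  include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset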
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Thursday 16 April 2026 19:30:08 -0400 (0:00:00.329) 0:07:44.763 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Thursday 16 April 2026 19:30:10 -0400 (0:00:02.103) 0:07:46.866 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Thursday 16 April 2026 19:30:11 -0400 (0:00:00.433) 0:07:47.300 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Thursday 16 April 2026 19:30:11 -0400 (0:00:00.411) 0:07:47.711 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Thursday 16 April 2026 19:30:12 -0400 (0:00:00.512) 0:07:48.224 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Thursday 16 April 2026 19:30:12 -0400 (0:00:00.267) 0:07:48.491 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Thursday 16 April 2026 19:30:12 -0400 (0:00:00.439) 0:07:48.931 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Thursday 16 April 2026 19:30:13 -0400 (0:00:00.327) 0:07:49.259 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Thursday 16 April 2026 19:30:13 -0400 (0:00:00.372) 0:07:49.632 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Thursday 16 April 2026 19:30:13 -0400 (0:00:00.239) 0:07:49.872 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Thursday 16 April 2026 19:30:14 -0400 (0:00:00.248) 0:07:50.120 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Thursday 16 April 2026 19:30:14 -0400 (0:00:00.149) 0:07:50.269 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Thursday 16 April 2026 19:30:14 -0400 (0:00:00.169) 0:07:50.438 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
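The facts above capture which /etc/fstab lines matched the device identifier, mount point, and options, alongside the expected counts of "1" that the next assertions consume. A sketch of how such match lists can be built, with the fstab source variable and filter expressions assumed:

- name: Set some variables for fstab checking
  set_fact:
    # storage_test_fstab as a registered 'cat /etc/fstab' result is an assumption
    storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines | select('search', storage_test_volume._mount_id ~ ' ') | list }}"
    storage_test_fstab_mount_point_matches: "{{ storage_test_fstab.stdout_lines | select('search', ' ' ~ storage_test_volume.mount_point ~ ' ') | list }}"
    storage_test_fstab_expected_id_matches: "{{ '1' if _storage_test_volume_present else '0' }}"

- name: Verify that the device identifier appears in /etc/fstab
  assert:
    that: storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int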
TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Thursday 16 April 2026 19:30:15 -0400 (0:00:00.655) 0:07:51.094 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Thursday 16 April 2026 19:30:15 -0400 (0:00:00.345) 0:07:51.439 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Thursday 16 April 2026 19:30:15 -0400 (0:00:00.278) 0:07:51.718 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Thursday 16 April 2026 19:30:15 -0400 (0:00:00.192) 0:07:51.911 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Thursday 16 April 2026 19:30:16 -0400 (0:00:00.352) 0:07:52.263 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Thursday 16 April 2026 19:30:16 -0400 (0:00:00.246) 0:07:52.510 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Thursday 16 April 2026 19:30:16 -0400 (0:00:00.337) 0:07:52.848 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Thursday 16 April 2026 19:30:17 -0400 (0:00:00.304) 0:07:53.153 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382168.3647447, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382168.3647447, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1237, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776382168.3647447, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Thursday 16 April 2026 19:30:18 -0400 (0:00:01.195) 0:07:54.349 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Thursday 16 April 2026 19:30:18 -0400 (0:00:00.416) 0:07:54.765 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Thursday 16 April 2026 19:30:19 -0400 (0:00:00.336) 0:07:55.102 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Thursday 16 April 2026 19:30:19 -0400 (0:00:00.342) 0:07:55.445 ********
ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Thursday 16 April 2026 19:30:19 -0400 (0:00:00.272) 0:07:55.717 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Thursday 16 April 2026 19:30:19 -0400 (0:00:00.193) 0:07:55.911 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
"/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:30:21 -0400 (0:00:01.338) 0:07:57.590 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:30:23 -0400 (0:00:01.956) 0:07:59.547 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.007177", "end": "2026-04-16 19:30:24.546208", "rc": 0, "start": "2026-04-16 19:30:24.539031" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 686240 Threads: 2 Salt: 50 a3 b4 0c 29 fc 04 6f 48 d5 b2 bb 4a f3 9e 20 85 68 7f 53 6d 4a 05 53 ff b6 a4 6a 6d c5 ba 38 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131466 Salt: fd 7f dd ac 56 13 b6 bf 22 0f 72 21 af 65 0f f4 8c 05 e6 af 38 1c 90 92 8a 74 e2 c8 9d ea 3a aa Digest: 52 a9 0d 26 a2 50 ec ea 6b f9 9e 63 14 2c c4 70 96 87 cd a8 00 c9 88 25 34 21 52 d2 31 52 11 af TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:30:24 -0400 (0:00:01.180) 0:08:00.728 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:30:25 -0400 (0:00:00.361) 0:08:01.089 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:30:25 -0400 (0:00:00.310) 0:08:01.399 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:30:25 -0400 (0:00:00.365) 0:08:01.765 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:30:26 -0400 (0:00:00.435) 0:08:02.201 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:30:26 -0400 (0:00:00.401) 0:08:02.602 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:30:26 -0400 (0:00:00.345) 0:08:02.947 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:30:27 -0400 (0:00:00.297) 0:08:03.245 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:30:27 -0400 (0:00:00.375) 0:08:03.621 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:30:27 -0400 (0:00:00.365) 0:08:03.987 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:30:28 -0400 (0:00:00.361) 0:08:04.375 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:30:28 -0400 (0:00:00.385) 0:08:04.761 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:30:29 -0400 
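The luksDump output collected above is the input for the three skipped checks: they run only when the test pins a LUKS version, key size, or cipher, and this test leaves all three unset (None). A sketch of one such check against the registered dump, with the register name and the version-string mapping assumed:

- name: Check LUKS version
  assert:
    that:
      # mapping 'luks2' -> the 'Version: 2' header line is an assumption
      - luks_dump.stdout is search('Version:\s+' ~ (storage_test_volume.encryption_luks_version | replace('luks', '')))
  when: not storage_test_volume.encryption_luks_version is none   # guard from the log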
TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Thursday 16 April 2026 19:30:29 -0400 (0:00:00.357) 0:08:05.119 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Thursday 16 April 2026 19:30:29 -0400 (0:00:00.238) 0:08:05.358 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Thursday 16 April 2026 19:30:29 -0400 (0:00:00.261) 0:08:05.619 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Thursday 16 April 2026 19:30:29 -0400 (0:00:00.316) 0:08:05.936 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Thursday 16 April 2026 19:30:30 -0400 (0:00:00.279) 0:08:06.215 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Thursday 16 April 2026 19:30:30 -0400 (0:00:00.222) 0:08:06.438 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Thursday 16 April 2026 19:30:30 -0400 (0:00:00.236) 0:08:06.674 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Thursday 16 April 2026 19:30:30 -0400 (0:00:00.173) 0:08:06.848 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Thursday 16 April 2026 19:30:31 -0400 (0:00:00.180) 0:08:07.028 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Thursday 16 April 2026 19:30:31 -0400 (0:00:00.248) 0:08:07.277 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Thursday 16 April 2026 19:30:31 -0400 (0:00:00.247) 0:08:07.524 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Thursday 16 April 2026 19:30:31 -0400 (0:00:00.264) 0:08:07.789 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Thursday 16 April 2026 19:30:32 -0400 (0:00:00.284) 0:08:08.073 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Thursday 16 April 2026 19:30:32 -0400 (0:00:00.196) 0:08:08.270 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Thursday 16 April 2026 19:30:32 -0400 (0:00:00.239) 0:08:08.510 ********
ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Thursday 16 April 2026 19:30:32 -0400 (0:00:00.248) 0:08:08.759 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Thursday 16 April 2026 19:30:32 -0400 (0:00:00.221) 0:08:08.981 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" }
TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Thursday 16 April 2026 19:30:33 -0400 (0:00:00.223) 0:08:09.205 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Thursday 16 April 2026 19:30:33 -0400 (0:00:00.245) 0:08:09.450 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Thursday 16 April 2026 19:30:33 -0400 (0:00:00.266) 0:08:09.717 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Thursday 16 April 2026 19:30:33 -0400 (0:00:00.264) 0:08:09.981 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Thursday 16 April 2026 19:30:34 -0400 (0:00:00.323) 0:08:10.305 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Thursday 16 April 2026 19:30:34 -0400 (0:00:00.354) 0:08:10.659 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Thursday 16 April 2026 19:30:35 -0400 (0:00:00.375) 0:08:11.035 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Thursday 16 April 2026 19:30:35 -0400 (0:00:00.250) 0:08:11.286 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Thursday 16 April 2026 19:30:35 -0400 (0:00:00.419) 0:08:11.706 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Thursday 16 April 2026 19:30:35 -0400 (0:00:00.247) 0:08:11.953 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Thursday 16 April 2026 19:30:36 -0400 (0:00:00.337) 0:08:12.291 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Thursday 16 April 2026 19:30:36 -0400 (0:00:00.300) 0:08:12.592 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Thursday 16 April 2026 19:30:36 -0400 (0:00:00.318) 0:08:12.911 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Thursday 16 April 2026 19:30:37 -0400 (0:00:00.296) 0:08:13.208 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Thursday 16 April 2026 19:30:37 -0400 (0:00:00.395) 0:08:13.603 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Thursday 16 April 2026 19:30:37 -0400 (0:00:00.200) 0:08:13.804 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Thursday 16 April 2026 19:30:38 -0400 (0:00:00.326) 0:08:14.131 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }
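The whole thin-pool chain above is skipped because storage_test_volume.thin is false; when it applies, the chain derives an expected size from the pool size and a percentage-style volume size. A sketch of the idea, with the pool-size fact and the percentage trigger assumed (only the task name and guards appear in the log):

- name: Calculate the expected size based on pool size and percentage value
  set_fact:
    # storage_test_pool_size (with a .bytes attribute) is an assumed upstream fact
    storage_test_expected_size: "{{ (storage_test_pool_size.bytes * ((storage_test_volume.size | replace('%', '') | float) / 100)) | int }}"
  when:
    - storage_test_volume.type == "lvm"
    - storage_test_volume.size is match('^[0-9]+%$')   # percentage trigger assumed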
TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Thursday 16 April 2026 19:30:38 -0400 (0:00:00.307) 0:08:14.439 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.thin | bool",
    "skip_reason": "Conditional result was False"
}

TASK [Show actual size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Thursday 16 April 2026 19:30:38 -0400 (0:00:00.315) 0:08:14.755 ********
ok: [managed-node13] => {
    "storage_test_actual_size": {
        "changed": false,
        "false_condition": "storage_test_volume.type not in ['partition', 'disk']",
        "skip_reason": "Conditional result was False",
        "skipped": true
    }
}

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Thursday 16 April 2026 19:30:39 -0400 (0:00:00.355) 0:08:15.110 ********
ok: [managed-node13] => {
    "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined"
}

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Thursday 16 April 2026 19:30:39 -0400 (0:00:00.332) 0:08:15.442 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == \"lvm\"",
    "skip_reason": "Conditional result was False"
}

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Thursday 16 April 2026 19:30:39 -0400 (0:00:00.291) 0:08:15.734 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Thursday 16 April 2026 19:30:39 -0400 (0:00:00.284) 0:08:16.019 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Check segment type] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Thursday 16 April 2026 19:30:40 -0400 (0:00:00.220) 0:08:16.240 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Thursday 16 April 2026 19:30:40 -0400 (0:00:00.230) 0:08:16.470 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}
TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Thursday 16 April 2026 19:30:40 -0400 (0:00:00.214) 0:08:16.685 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Thursday 16 April 2026 19:30:40 -0400 (0:00:00.220) 0:08:16.906 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Check cache size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Thursday 16 April 2026 19:30:41 -0400 (0:00:00.293) 0:08:17.199 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present",
    "skip_reason": "Conditional result was False"
}

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Thursday 16 April 2026 19:30:41 -0400 (0:00:00.209) 0:08:17.409 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_test_volume_present": null
    },
    "changed": false
}

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Thursday 16 April 2026 19:30:41 -0400 (0:00:00.277) 0:08:17.686 ********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Thursday 16 April 2026 19:30:41 -0400 (0:00:00.306) 0:08:17.993 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "storage_test_blkinfo": null,
        "storage_test_crypttab": null,
        "storage_test_fstab": null
    },
    "changed": false
}

TASK [Create a file] ***********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Thursday 16 April 2026 19:30:42 -0400 (0:00:00.201) 0:08:18.194 ********
changed: [managed-node13] => {
    "changed": true,
    "dest": "/opt/test1/quux",
    "gid": 0,
    "group": "root",
    "mode": "0644",
    "owner": "root",
    "secontext": "unconfined_u:object_r:unlabeled_t:s0",
    "size": 0,
    "state": "file",
    "uid": 0
}

TASK [Test for correct handling of safe_mode - 3] ******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:234
Thursday 16 April 2026 19:30:43 -0400 (0:00:01.287) 0:08:19.481 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13
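[NOTE]: "Create a file" and the later data-preservation check form the harness around each safe_mode scenario: touch a file on the mounted volume, ask the role to make a destructive change while safe mode is on, then verify both that the role failed and that the file survived. A sketch of that pattern; the module choices here are an assumption, only the path /opt/test1/quux comes from the log:

    - name: Create a file
      ansible.builtin.file:
        path: /opt/test1/quux
        state: touch

    # ... invoke the role with a change that would reformat the device ...

    - name: Stat the file
      ansible.builtin.stat:
        path: /opt/test1/quux
      register: __file_status

    - name: Assert the file is still present
      ansible.builtin.assert:
        that: __file_status.stat.exists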
TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Thursday 16 April 2026 19:30:44 -0400 (0:00:01.085) 0:08:20.567 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "storage_pools_global": [],
        "storage_safe_mode_global": true,
        "storage_volumes_global": []
    },
    "changed": false
}

TASK [Verify role raises correct error - 2] ************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Thursday 16 April 2026 19:30:44 -0400 (0:00:00.415) 0:08:20.983 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13

TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:30:45 -0400 (0:00:00.334) 0:08:21.317 ********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Thursday 16 April 2026 19:30:45 -0400 (0:00:00.024) 0:08:21.342 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "__sr_failed_when is defined",
    "skip_reason": "Conditional result was False"
}

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Thursday 16 April 2026 19:30:45 -0400 (0:00:00.250) 0:08:21.592 ********
included: fedora.linux_system_roles.storage for managed-node13

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 16 April 2026 19:30:46 -0400 (0:00:00.471) 0:08:22.064 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "discovered_interpreter_python": "/usr/bin/python3.9"
    },
    "changed": false,
    "message": "Message written to syslog"
}

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7
Thursday 16 April 2026 19:30:47 -0400 (0:00:01.436) 0:08:23.500 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 16 April 2026 19:30:48 -0400 (0:00:01.215) 0:08:24.716 ********
ok: [managed-node13]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 16 April 2026 19:30:50 -0400 (0:00:01.520) 0:08:26.236 ********
skipping: [managed-node13] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node13] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
"false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:30:51 -0400 (0:00:00.837) 0:08:27.074 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:30:51 -0400 (0:00:00.296) 0:08:27.371 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:30:51 -0400 (0:00:00.254) 0:08:27.625 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:30:51 -0400 (0:00:00.223) 0:08:27.848 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:30:52 -0400 (0:00:00.193) 0:08:28.042 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: 
TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Thursday 16 April 2026 19:30:51 -0400 (0:00:00.837) 0:08:27.074 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Thursday 16 April 2026 19:30:51 -0400 (0:00:00.296) 0:08:27.371 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "not __storage_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10
Thursday 16 April 2026 19:30:51 -0400 (0:00:00.254) 0:08:27.625 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14
Thursday 16 April 2026 19:30:51 -0400 (0:00:00.223) 0:08:27.848 ********
ok: [managed-node13] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18
Thursday 16 April 2026 19:30:52 -0400 (0:00:00.193) 0:08:28.042 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Thursday 16 April 2026 19:30:52 -0400 (0:00:00.881) 0:08:28.923 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "_blivet_custom_repo.key is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Thursday 16 April 2026 19:30:53 -0400 (0:00:00.338) 0:08:29.262 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition": "_blivet_custom_repo.key is defined",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Thursday 16 April 2026 19:30:53 -0400 (0:00:00.258) 0:08:29.520 ********
ok: [managed-node13] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Thursday 16 April 2026 19:30:55 -0400 (0:00:01.898) 0:08:31.419 ********
ok: [managed-node13] => {
    "storage_pools | d([])": [
        {
            "disks": [
                "sda"
            ],
            "name": "foo",
            "type": "partition",
            "volumes": [
                {
                    "encryption": false,
                    "encryption_password": "yabbadabbadoo",
                    "mount_point": "/opt/test1",
                    "name": "test1",
                    "size": "4g",
                    "type": "partition"
                }
            ]
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Thursday 16 April 2026 19:30:55 -0400 (0:00:00.342) 0:08:31.761 ********
ok: [managed-node13] => {
    "storage_volumes | d([])": []
}

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Thursday 16 April 2026 19:30:56 -0400 (0:00:00.341) 0:08:32.103 ********
ok: [managed-node13] => {
    "actions": [],
    "changed": false,
    "crypts": [],
    "leaves": [],
    "mounts": [],
    "packages": [],
    "pools": [],
    "volumes": []
}

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Thursday 16 April 2026 19:30:58 -0400 (0:00:02.099) 0:08:34.203 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Thursday 16 April 2026 19:30:58 -0400 (0:00:00.431) 0:08:34.635 ********
skipping: [managed-node13] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Thursday 16 April 2026 19:30:58 -0400 (0:00:00.243) 0:08:34.878 ********
skipping: [managed-node13] => {
    "changed": false,
    "false_condition":
"install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:30:59 -0400 (0:00:00.256) 0:08:35.135 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:30:59 -0400 (0:00:00.218) 0:08:35.353 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:31:00 -0400 (0:00:01.654) 0:08:37.008 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": 
"sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service": { "name": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:31:03 -0400 (0:00:03.002) 0:08:40.010 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d0ac178a1\x2d61aa\x2d4226\x2db754\x2d53f41de902ef.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "name": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" dev-sda.device cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d0ac178a1\\\\x2d61aa\\\\x2d4226\\\\x2db754\\\\x2d53f41de902ef.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown 
cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-0ac178a1-61aa-4226-b754-53f41de902ef", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0ac178a1-61aa-4226-b754-53f41de902ef /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-0ac178a1-61aa-4226-b754-53f41de902ef /dev/sda - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0ac178a1-61aa-4226-b754-53f41de902ef ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-0ac178a1-61aa-4226-b754-53f41de902ef ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", 
"LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d0ac178a1\\\\x2d61aa\\\\x2d4226\\\\x2db754\\\\x2d53f41de902ef.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:29:39 EDT", "StateChangeTimestampMonotonic": "1990038334", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", 
"TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d0ac178a1\\\\x2d61aa\\\\x2d4226\\\\x2db754\\\\x2d53f41de902ef.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:31:06 -0400 (0:00:02.118) 0:08:42.129 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:31:08 -0400 (0:00:02.204) 0:08:44.333 ******** fatal: [managed-node13]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 
'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
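The failure above is the role's safe-mode guard doing its job: reaching the requested state (encryption: false) requires destroying the existing LUKS formatting, and the module was invoked with 'safe_mode': True (visible in the module args above), so blivet refuses to touch the device rather than silently discard an encryption layer. Below is a minimal sketch of a play that opts out of the guard via the role's storage_safe_mode variable; the pool and volume values are copied from the module args above, but the play itself is an illustration, not the test's actual code:

    - name: Remove the LUKS layer from an existing volume (illustrative sketch)
      hosts: managed-node13
      tasks:
        - name: Run the storage role with safe mode disabled
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            # storage_safe_mode defaults to true; true blocks destructive reformats
            storage_safe_mode: false
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    # flipping encryption to false is what removes the LUKS layer
                    encryption: false
                    encryption_password: yabbadabbadoo

With safe mode out of the way the identical request succeeds, as the "Remove the encryption layer - 2" run below shows.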
"Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d0ac178a1\\x2d61aa\\x2d4226\\x2db754\\x2d53f41de902ef.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d0ac178a1\\\\x2d61aa\\\\x2d4226\\\\x2db754\\\\x2d53f41de902ef.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": 
"no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:31:10 -0400 (0:00:01.897) 0:08:46.534 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:31:10 -0400 (0:00:00.286) 0:08:46.821 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:31:11 -0400 (0:00:00.430) 0:08:47.252 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:31:11 -0400 (0:00:00.263) 0:08:47.515 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382243.320762, "attr_flags": "", 
"attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382243.320762, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776382243.320762, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2271752031", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:31:12 -0400 (0:00:00.961) 0:08:48.476 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer - 2] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:258 Thursday 16 April 2026 19:31:12 -0400 (0:00:00.321) 0:08:48.798 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:31:13 -0400 (0:00:00.891) 0:08:49.689 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:31:13 -0400 (0:00:00.028) 0:08:49.718 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:31:13 -0400 (0:00:00.282) 0:08:50.000 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:31:14 -0400 (0:00:00.460) 0:08:50.461 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:31:15 -0400 (0:00:01.494) 0:08:51.955 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:31:16 -0400 (0:00:00.241) 0:08:52.197 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:31:18 -0400 (0:00:01.851) 0:08:54.048 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:31:18 -0400 (0:00:00.758) 0:08:54.807 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:31:19 -0400 (0:00:00.313) 0:08:55.120 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:31:19 -0400 (0:00:00.221) 0:08:55.342 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:31:19 -0400 (0:00:00.228) 0:08:55.570 ******** ok: 
[managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:31:19 -0400 (0:00:00.288) 0:08:55.860 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:31:20 -0400 (0:00:00.820) 0:08:56.680 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:31:20 -0400 (0:00:00.330) 0:08:57.011 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:31:21 -0400 (0:00:00.306) 0:08:57.317 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:31:23 -0400 (0:00:01.978) 0:08:59.296 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:31:23 -0400 (0:00:00.391) 0:08:59.687 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:31:23 -0400 (0:00:00.239) 0:08:59.927 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:31:26 -0400 (0:00:02.156) 0:09:02.084 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK 
[fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:31:26 -0400 (0:00:00.403) 0:09:02.488 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:31:26 -0400 (0:00:00.145) 0:09:02.633 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:31:26 -0400 (0:00:00.243) 0:09:02.879 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:31:27 -0400 (0:00:00.184) 0:09:03.063 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:31:28 -0400 (0:00:01.960) 0:09:05.024 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": 
"lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": 
"sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service": { "name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:31:34 -0400 (0:00:05.095) 0:09:10.120 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d2faf63cf\x2da8c6\x2d4dd2\x2db395\x2dd9542a5f7a02.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target systemd-udevd-kernel.socket systemd-journald.socket dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", 
"AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.target\" cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": 
"18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": 
"0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:31:10 EDT", "StateChangeTimestampMonotonic": "2080726688", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:31:36 -0400 (0:00:02.155) 0:09:12.275 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": 
"UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:31:39 -0400 (0:00:03.041) 0:09:15.317 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:31:39 -0400 (0:00:00.309) 0:09:15.626 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382177.730747, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4cd2a91c5b67030d9b3d3e28b3af44fd19550645", "ctime": 1776382177.727747, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382177.727747, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:31:40 -0400 (0:00:01.349) 0:09:16.976 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:31:42 -0400 (0:00:01.269) 0:09:18.245 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d2faf63cf\x2da8c6\x2d4dd2\x2db395\x2dd9542a5f7a02.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", 
"name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", 
"LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.device\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:31:10 EDT", "StateChangeTimestampMonotonic": "2080726688", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", 
"TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:31:44 -0400 (0:00:02.256) 0:09:20.502 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sda1", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "mounted" } ], "packages": [ "xfsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test 
verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:31:44 -0400 (0:00:00.365) 0:09:20.867 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:31:45 -0400 (0:00:00.322) 0:09:21.189 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:31:45 -0400 (0:00:00.321) 0:09:21.543 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 
16 April 2026 19:31:47 -0400 (0:00:01.660) 0:09:23.203 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} }
TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:31:48 -0400 (0:00:01.725) 0:09:24.928 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29" }
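The two mount tasks above are the role replaying its computed mounts list through ansible.builtin.mount (redirected here to ansible.posix.mount): the stale entry pointing at the now-destroyed /dev/mapper/luks-* node was removed with state: absent, and the recreated filesystem is mounted by UUID, so the fstab entry stays valid even though the underlying device node changed. A standalone equivalent of the second item, as a sketch rather than the role's actual loop, would be:

    - name: Mount the recreated filesystem by its new UUID (standalone sketch)
      ansible.posix.mount:
        src: UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29
        path: /opt/test1
        fstype: xfs
        opts: defaults
        state: mounted   # writes the /etc/fstab entry and mounts it immediately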
"root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "3917043845", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:31:53 -0400 (0:00:01.218) 0:09:29.977 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda1', 'name': 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:31:55 -0400 (0:00:01.415) 0:09:31.393 ******** ok: [managed-node13] TASK [Verify role results - 5] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:274 Thursday 16 April 2026 19:31:57 -0400 (0:00:02.013) 0:09:33.406 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:31:58 -0400 (0:00:01.194) 0:09:34.601 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, 
"type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:31:59 -0400 (0:00:00.430) 0:09:35.031 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:31:59 -0400 (0:00:00.238) 0:09:35.269 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:32:00 -0400 (0:00:01.306) 0:09:36.576 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003021", "end": "2026-04-16 19:32:01.663704", "rc": 0, "start": "2026-04-16 19:32:01.660683" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29 /opt/test1 xfs defaults 0 0
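Each entry in the STDOUT above follows the six-field fstab(5) layout: device specifier, mount point, filesystem type, mount options, dump flag, and fsck pass number. The last line is the one the role just wrote, and its fields correspond one-to-one to the volume's mount settings echoed in blivet_output (opts defaults, dump 0, passno 0):

    UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29 /opt/test1 xfs defaults 0 0

Referencing the filesystem by UUID rather than by /dev/sda1 keeps the entry stable across device renaming, which matters here because the same mount point was previously backed by a /dev/mapper/luks-* node.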
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:32:01 -0400 (0:00:01.297) 0:09:37.873 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002868", "end": "2026-04-16 19:32:02.930690", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:32:02.927822" }
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:32:03 -0400 (0:00:01.261) 0:09:39.171 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}]})
TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Thursday 16 April 2026 19:32:03 -0400 (0:00:00.538) 0:09:39.709 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Thursday 16 April 2026 19:32:03 -0400 (0:00:00.194) 0:09:39.904 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Thursday 16 April 2026 19:32:04 -0400 (0:00:00.180) 0:09:40.084 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Thursday 16 April 2026 19:32:04 -0400 (0:00:00.231) 0:09:40.316 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes)
TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Thursday 16 April 2026 19:32:04 -0400 (0:00:00.695) 0:09:41.011 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }
TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Thursday 16 April 2026 19:32:05 -0400 (0:00:00.225) 0:09:41.237 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:32:05 -0400 (0:00:00.154) 0:09:41.392 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }
TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Thursday 16 April 2026 19:32:05 -0400 (0:00:00.220) 0:09:41.613 ******** skipping: [managed-node13] => {
"changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:32:05 -0400 (0:00:00.249) 0:09:41.862 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:32:06 -0400 (0:00:00.215) 0:09:42.078 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:32:06 -0400 (0:00:00.214) 0:09:42.292 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:32:06 -0400 (0:00:00.210) 0:09:42.503 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:32:06 -0400 (0:00:00.282) 0:09:42.785 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:32:06 -0400 (0:00:00.181) 0:09:42.966 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:93191): WARNING **: 19:32:07.902: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:32:08 -0400 (0:00:01.194) 0:09:44.161 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:32:08 -0400 (0:00:00.181) 0:09:44.342 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:32:08 -0400 (0:00:00.538) 0:09:44.881 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:32:09 -0400 (0:00:00.149) 0:09:45.031 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:32:09 -0400 (0:00:00.225) 0:09:45.257 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:32:09 -0400 (0:00:00.267) 0:09:45.524 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:32:09 -0400 (0:00:00.220) 0:09:45.745 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 
2026 19:32:09 -0400 (0:00:00.205) 0:09:45.951 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:32:10 -0400 (0:00:00.248) 0:09:46.199 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:32:10 -0400 (0:00:00.169) 0:09:46.369 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:32:10 -0400 (0:00:00.143) 0:09:46.513 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:32:10 -0400 (0:00:00.230) 0:09:46.743 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:32:10 -0400 (0:00:00.232) 0:09:46.976 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:32:11 -0400 (0:00:00.294) 0:09:47.270 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:32:11 -0400 (0:00:00.615) 0:09:47.886 ******** skipping: [managed-node13] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 
'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:32:12 -0400 (0:00:00.259) 0:09:48.146 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:32:12 -0400 (0:00:00.778) 0:09:48.925 ******** skipping: [managed-node13] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': 
None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Thursday 16 April 2026 19:32:14 -0400 (0:00:01.518) 0:09:50.443 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Thursday 16 April 2026 19:32:15 -0400 (0:00:00.827) 0:09:51.270 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Thursday 16 April 2026 19:32:15 -0400 (0:00:00.327) 0:09:51.598 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Thursday 16 April 2026 19:32:15 -0400 (0:00:00.225) 0:09:51.823 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear 
test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Thursday 16 April 2026 19:32:16 -0400 (0:00:00.206) 0:09:52.030 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Thursday 16 April 2026 19:32:16 -0400 (0:00:00.297) 0:09:52.328 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Thursday 16 April 2026 19:32:17 -0400 (0:00:00.768) 0:09:53.097 ******** skipping: [managed-node13] => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, 
"thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Thursday 16 April 2026 19:32:17 -0400 (0:00:00.372) 0:09:53.469 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Thursday 16 April 2026 19:32:18 -0400 (0:00:00.877) 0:09:54.347 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Thursday 16 April 2026 19:32:18 -0400 (0:00:00.241) 0:09:54.589 ******** skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Thursday 16 April 2026 19:32:18 -0400 (0:00:00.243) 0:09:54.832 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Thursday 16 April 2026 19:32:19 -0400 (0:00:00.286) 0:09:55.119 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Thursday 16 April 2026 19:32:19 -0400 (0:00:00.212) 0:09:55.332 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Thursday 16 April 2026 19:32:19 -0400 (0:00:00.210) 0:09:55.542 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Thursday 16 April 2026 19:32:19 -0400 (0:00:00.214) 0:09:55.757 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task 
path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Thursday 16 April 2026 19:32:19 -0400 (0:00:00.235) 0:09:55.992 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Thursday 16 April 2026 19:32:20 -0400 (0:00:00.275) 0:09:56.268 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks2', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/sda1', '_raw_device': '/dev/sda1', '_mount_id': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', '_kernel_device': '/dev/sda1', '_raw_kernel_device': '/dev/sda1'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:32:20 -0400 (0:00:00.506) 0:09:56.774 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:32:21 -0400 (0:00:00.373) 0:09:57.148 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:32:23 -0400 (0:00:02.239) 0:09:59.388 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:32:23 -0400 (0:00:00.383) 0:09:59.772 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:32:24 -0400 (0:00:00.433) 0:10:00.205 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:32:24 -0400 (0:00:00.529) 0:10:00.735 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:32:25 -0400 (0:00:00.367) 0:10:01.103 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:32:25 -0400 (0:00:00.466) 0:10:01.569 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:32:25 -0400 (0:00:00.431) 0:10:02.001 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not 
storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:32:26 -0400 (0:00:00.364) 0:10:02.366 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:32:26 -0400 (0:00:00.261) 0:10:02.646 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:32:26 -0400 (0:00:00.194) 0:10:02.840 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }
TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:32:27 -0400 (0:00:00.292) 0:10:03.133 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:32:27 -0400 (0:00:00.213) 0:10:03.346 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:32:28 -0400 (0:00:00.978) 0:10:04.325 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:32:28 -0400 (0:00:00.335) 0:10:04.660 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
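The fstab checks above follow a simple count-based pattern: a set_fact step collects regex matches for the device identifier, mount point, and mount options out of /etc/fstab, and the assert steps then compare the number of matches with the expected counts. A minimal standalone sketch of that pattern, using the UUID and mount point from this run; the __fstab_* variable names are hypothetical, not the role's internal storage_test_fstab_* facts:

- name: Read /etc/fstab
  ansible.builtin.slurp:
    path: /etc/fstab
  register: __fstab_raw

- name: Collect matches for the volume  # same idea as "Set some variables for fstab checking"
  ansible.builtin.set_fact:
    __fstab_id_matches: "{{ __fstab_raw.content | b64decode | regex_findall('UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29 ') }}"
    __fstab_mount_point_matches: "{{ __fstab_raw.content | b64decode | regex_findall(' /opt/test1 ') }}"

- name: Verify the device identifier and mount point each appear exactly once
  ansible.builtin.assert:
    that:
      - __fstab_id_matches | length == 1
      - __fstab_mount_point_matches | length == 1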
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:32:29 -0400 (0:00:00.458) 0:10:05.122 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:32:29 -0400 (0:00:00.247) 0:10:05.369 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:32:29 -0400 (0:00:00.381) 0:10:05.750 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:32:29 -0400 (0:00:00.227) 0:10:06.011 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:32:30 -0400 (0:00:00.479) 0:10:06.491 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:32:30 -0400 (0:00:00.328) 0:10:06.819 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382298.9397748, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382298.9397748, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1237, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776382298.9397748, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:32:32 -0400 (0:00:01.306) 0:10:08.125 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
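The device-node check is just a stat of the expected path plus an assert on the returned flags (note "isblk": true and "device_type": 2049 in the stat output above). A minimal sketch under the same assumptions (device path /dev/sda1, volume expected to be present); the __device_stat name is hypothetical:

- name: See whether the device node is present
  ansible.builtin.stat:
    path: /dev/sda1
    get_checksum: false
  register: __device_stat

- name: Verify the presence/absence of the device node
  ansible.builtin.assert:
    that:
      - __device_stat.stat.exists
      - __device_stat.stat.isblk
    msg: expected /dev/sda1 to exist as a block device node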
TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:32:32 -0400 (0:00:00.496) 0:10:08.622 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:32:32 -0400 (0:00:00.227) 0:10:08.849 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:32:33 -0400 (0:00:00.354) 0:10:09.203 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:32:33 -0400 (0:00:00.369) 0:10:09.573 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:32:33 -0400 (0:00:00.216) 0:10:09.790 ******** ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:32:34 -0400 (0:00:00.323) 0:10:10.113 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }
TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:32:34 -0400 (0:00:00.218) 0:10:10.332 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:32:36 -0400 (0:00:01.836) 0:10:12.168 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" }
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:32:36 -0400 (0:00:00.224) 0:10:12.393 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" }
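All of these LUKS steps skip because this volume was created with encryption disabled; when encryption is on, the "Collect LUKS info" step interrogates the device with cryptsetup and the later steps assert on the result. A hedged sketch of what such a collect-and-check could look like; the exact command and assertions are test internals not shown in this log, and __luks_dump is a hypothetical name (only cryptsetup itself and the /dev/sda1 path come from this run):

- name: Collect LUKS info for this volume
  ansible.builtin.command: cryptsetup luksDump /dev/sda1
  register: __luks_dump
  changed_when: false
  when: storage_test_volume.encryption | bool

- name: Check LUKS version  # expects luks2, per encryption_luks_version above
  ansible.builtin.assert:
    that:
      - __luks_dump.stdout is search('Version:\\s+2')
  when: storage_test_volume.encryption | bool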
"skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:32:36 -0400 (0:00:00.322) 0:10:12.715 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:32:36 -0400 (0:00:00.303) 0:10:13.018 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:32:37 -0400 (0:00:00.201) 0:10:13.220 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:32:37 -0400 (0:00:00.153) 0:10:13.374 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:32:37 -0400 (0:00:00.246) 0:10:13.621 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:32:37 -0400 (0:00:00.216) 0:10:13.837 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:32:38 -0400 (0:00:00.273) 0:10:14.111 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:32:38 -0400 (0:00:00.402) 0:10:14.550 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] 
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:32:38 -0400 (0:00:00.313) 0:10:14.867 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:32:39 -0400 (0:00:00.259) 0:10:15.127 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:32:39 -0400 (0:00:00.238) 0:10:15.378 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:32:39 -0400 (0:00:00.270) 0:10:15.648 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:32:39 -0400 (0:00:00.289) 0:10:15.938 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:32:40 -0400 (0:00:00.198) 0:10:16.136 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:32:40 -0400 (0:00:00.235) 0:10:16.373 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:32:40 -0400 (0:00:00.273) 0:10:16.646 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex]
**************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:32:40 -0400 (0:00:00.191) 0:10:16.868 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:32:41 -0400 (0:00:00.304) 0:10:17.172 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:32:41 -0400 (0:00:00.307) 0:10:17.480 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:32:41 -0400 (0:00:00.287) 0:10:17.768 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:32:41 -0400 (0:00:00.214) 0:10:17.982 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:32:42 -0400 (0:00:00.249) 0:10:18.232 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:32:42 -0400 (0:00:00.243) 0:10:18.475 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:32:42 -0400 (0:00:00.193) 0:10:18.669 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:32:42 -0400 (0:00:00.251) 0:10:18.920 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:32:43 -0400 (0:00:00.259) 0:10:19.180 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:32:43 -0400 (0:00:00.251) 0:10:19.432 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:32:43 -0400 (0:00:00.262) 0:10:19.694 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:32:43 -0400 (0:00:00.231) 0:10:19.926 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:32:44 -0400 (0:00:00.260) 0:10:20.187 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:32:44 -0400 (0:00:00.271) 0:10:20.459 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:32:44 -0400 (0:00:00.235) 0:10:20.694 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:32:44 -0400 (0:00:00.321) 0:10:21.015 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional 
result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:32:45 -0400 (0:00:00.357) 0:10:21.373 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:32:45 -0400 (0:00:00.351) 0:10:21.725 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:32:45 -0400 (0:00:00.250) 0:10:21.975 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:32:46 -0400 (0:00:00.370) 0:10:22.346 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:32:46 -0400 (0:00:00.364) 0:10:22.710 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:32:47 -0400 (0:00:00.326) 0:10:23.037 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:32:47 -0400 (0:00:00.346) 0:10:23.383 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:32:47 -0400 (0:00:00.361) 0:10:23.745 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:32:48 -0400 (0:00:00.360) 0:10:24.105 ******** skipping: 
[managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:32:48 -0400 (0:00:00.425) 0:10:24.531 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:32:48 -0400 (0:00:00.401) 0:10:24.932 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:32:49 -0400 (0:00:00.317) 0:10:25.249 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:32:49 -0400 (0:00:00.362) 0:10:25.612 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:32:49 -0400 (0:00:00.311) 0:10:25.924 ******** ok: [managed-node13] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:32:50 -0400 (0:00:00.328) 0:10:26.253 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:32:50 -0400 (0:00:00.278) 0:10:26.532 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:32:50 -0400 (0:00:00.275) 0:10:26.808 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:32:50 -0400 (0:00:00.214) 0:10:27.022 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:32:51 -0400 (0:00:00.269) 0:10:27.292 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:32:51 -0400 (0:00:00.277) 0:10:27.570 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:32:51 -0400 (0:00:00.252) 0:10:27.822 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:32:52 -0400 (0:00:00.349) 0:10:28.172 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:32:52 -0400 (0:00:00.244) 0:10:28.416 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:32:52 -0400 (0:00:00.260) 0:10:28.677 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:32:52 -0400 (0:00:00.241) 0:10:28.919 
******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:32:53 -0400 (0:00:00.213) 0:10:29.132 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
TASK [Create a file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Thursday 16 April 2026 19:32:53 -0400 (0:00:00.193) 0:10:29.326 ******** changed: [managed-node13] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
TASK [Test for correct handling of safe_mode - 4] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:280 Thursday 16 April 2026 19:32:54 -0400 (0:00:01.332) 0:10:30.659 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13
TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Thursday 16 April 2026 19:32:55 -0400 (0:00:01.039) 0:10:31.698 ******** ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Thursday 16 April 2026 19:32:55 -0400 (0:00:00.254) 0:10:31.952 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:32:56 -0400 (0:00:00.304) 0:10:32.256 ******** META: facts cleared
TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:32:56 -0400 (0:00:00.033) 0:10:32.290 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }
TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:32:56 -0400 (0:00:00.192) 0:10:32.482 ******** included: fedora.linux_system_roles.storage for managed-node13
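This safe_mode test is a run-and-expect-failure check: a file (/opt/test1/quux) is first created on the mounted volume, and the role is then re-run with storage_safe_mode at its default of true and a pool spec that would reformat the device; the role is expected to refuse rather than destroy the data. A hedged sketch of that verify-role-failed pattern; the block/rescue structure, the __test_pools variable, and the sentinel message are illustrative (the real test additionally matches the role's specific error text, which is not shown in this part of the log):

- name: Run the role and expect the safe-mode refusal
  block:
    - name: Run the role with a spec that would reformat an in-use device
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: true  # the default, shown here for clarity
        storage_pools: "{{ __test_pools }}"  # hypothetical: the pool spec under test
    - name: Fail if the role did not fail
      ansible.builtin.fail:
        msg: sentinel - the role was expected to fail in safe mode
  rescue:
    - name: Confirm the failure came from the role, not the sentinel task
      ansible.builtin.assert:
        that:
          - ansible_failed_result.msg != 'sentinel - the role was expected to fail in safe mode'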
TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:32:56 -0400 (0:00:00.465) 0:10:32.947 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:32:58 -0400 (0:00:01.386) 0:10:34.334 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:32:58 -0400 (0:00:01.734) 0:10:34.632 ******** ok: [managed-node13]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:33:00 -0400 (0:00:01.734) 0:10:36.367 ********
skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:33:01 -0400 (0:00:00.815) 0:10:37.183 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:33:01 -0400 (0:00:00.371) 0:10:37.554 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }
defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:33:01 -0400 (0:00:00.364) 0:10:37.918 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:33:02 -0400 (0:00:00.238) 0:10:38.157 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:33:02 -0400 (0:00:00.262) 0:10:38.420 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:33:03 -0400 (0:00:00.810) 0:10:39.230 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:33:03 -0400 (0:00:00.300) 0:10:39.531 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:33:03 -0400 (0:00:00.262) 0:10:39.794 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:33:05 -0400 (0:00:02.035) 0:10:41.829 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:33:06 -0400 (0:00:00.390) 0:10:42.219 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:33:06 -0400 (0:00:00.371) 0:10:42.591 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:33:08 -0400 (0:00:02.245) 0:10:44.836 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:33:09 -0400 (0:00:00.558) 0:10:45.395 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:33:09 -0400 (0:00:00.221) 0:10:45.616 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:33:09 -0400 (0:00:00.215) 0:10:45.832 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:33:10 -0400 (0:00:00.201) 0:10:46.033 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:33:12 -0400 (0:00:02.007) 0:10:48.041 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name":
"autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": 
"kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service": { "name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": 
"systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:33:18 -0400 (0:00:06.563) 
0:10:54.605 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d2faf63cf\x2da8c6\x2d4dd2\x2db395\x2dd9542a5f7a02.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-sda1.device \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.target\" umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 /dev/sda1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", 
"ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", 
"ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:31:10 EDT", "StateChangeTimestampMonotonic": "2080726688", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:33:20 -0400 (0:00:01.751) 0:10:56.356 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:33:22 -0400 (0:00:02.305) 0:10:58.662 ******** fatal: [managed-node13]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:33:22 -0400 (0:00:00.329) 0:10:58.992 ******** changed: [managed-node13] => 
(item=systemd-cryptsetup@luks\x2d2faf63cf\x2da8c6\x2d4dd2\x2db395\x2dd9542a5f7a02.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "name": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d2faf63cf\\x2da8c6\\x2d4dd2\\x2db395\\x2dd9542a5f7a02.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d2faf63cf\\\\x2da8c6\\\\x2d4dd2\\\\x2db395\\\\x2dd9542a5f7a02.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": 
"1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:33:24 -0400 (0:00:01.521) 0:11:00.514 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:33:24 -0400 (0:00:00.209) 0:11:00.724 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:33:25 -0400 (0:00:00.355) 0:11:01.079 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:33:25 -0400 (0:00:00.249) 0:11:01.329 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382374.4487925, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382374.4487925, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776382374.4487925, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3493939599", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:33:26 -0400 (0:00:01.116) 0:11:02.445 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306 Thursday 16 April 2026 19:33:26 -0400 (0:00:00.337) 0:11:02.783 ******** ok: [managed-node13] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_test17bktr_slukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK 
TASK [Create a key file] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:306
Thursday 16 April 2026 19:33:26 -0400 (0:00:00.337) 0:11:02.783 ********
ok: [managed-node13] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": "root", "path": "/tmp/storage_test17bktr_slukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 }

TASK [Write the key into the key file] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313
Thursday 16 April 2026 19:33:29 -0400 (0:00:02.272) 0:11:05.055 ********
ok: [managed-node13] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_test17bktr_slukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1776382409.318843-144292-182253020750035/.source", "state": "file", "uid": 0 }
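These two tasks prepare the key material for the second attempt: an empty root-owned file with mode 0600, then 32 bytes of key written into it. The return fields in the log (path, checksum, src) are consistent with ansible.builtin.tempfile followed by ansible.builtin.copy, so here is a sketch of that pattern; the module choice and the variable holding the key are assumptions, not confirmed by the log:

    - name: Create a key file
      ansible.builtin.tempfile:
        prefix: storage_test
        suffix: lukskey
      register: __key_file   # tempfile creates the file with mode 0600

    - name: Write the key into the key file
      ansible.builtin.copy:
        dest: "{{ __key_file.path }}"
        content: "{{ __luks_key }}"   # hypothetical variable holding the 32-byte key
        mode: "0600"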
TASK [Add encryption to the volume - 2] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:320
Thursday 16 April 2026 19:33:32 -0400 (0:00:03.731) 0:11:08.788 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13

TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:33:33 -0400 (0:00:00.264) 0:11:09.052 ********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Thursday 16 April 2026 19:33:33 -0400 (0:00:00.002) 0:11:09.054 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Thursday 16 April 2026 19:33:33 -0400 (0:00:00.276) 0:11:09.331 ********
included: fedora.linux_system_roles.storage for managed-node13

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 16 April 2026 19:33:33 -0400 (0:00:00.458) 0:11:09.789 ********
ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7
Thursday 16 April 2026 19:33:34 -0400 (0:00:01.234) 0:11:11.024 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 16 April 2026 19:33:35 -0400 (0:00:00.279) 0:11:11.303 ********
ok: [managed-node13]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 16 April 2026 19:33:36 -0400 (0:00:01.619) 0:11:12.922 ********
skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Thursday 16 April 2026 19:33:37 -0400 (0:00:00.787) 0:11:13.710 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Thursday 16 April 2026 19:33:38 -0400 (0:00:00.363) 0:11:14.073 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10
Thursday 16 April 2026 19:33:38 -0400 (0:00:00.222) 0:11:14.295 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14
Thursday 16 April 2026 19:33:38 -0400 (0:00:00.178) 0:11:14.474 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18
Thursday 16 April 2026 19:33:38 -0400 (0:00:00.150) 0:11:14.624 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
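The "redirecting" notices here and earlier in the run show that the role's provider tasks invoke the mount module, which ansible-core routes from ansible.builtin.mount to ansible.posix.mount, so the ansible.posix collection has to be available alongside the role. A sketch of a requirements file covering both, with version pins deliberately left out:

    collections:
      - name: fedora.linux_system_roles
      - name: ansible.posix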
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:33:39 -0400 (0:00:00.794) 0:11:15.418 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:33:39 -0400 (0:00:00.187) 0:11:15.606 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:33:39 -0400 (0:00:00.227) 0:11:15.833 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:33:41 -0400 (0:00:01.761) 0:11:17.595 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_test17bktr_slukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:33:41 -0400 (0:00:00.328) 0:11:17.923 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:33:42 -0400 (0:00:00.365) 0:11:18.288 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:33:44 -0400 (0:00:02.014) 0:11:20.302 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:33:44 -0400 (0:00:00.372) 0:11:20.675 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK 
[fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:33:44 -0400 (0:00:00.193) 0:11:20.868 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:33:45 -0400 (0:00:00.299) 0:11:21.167 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:33:45 -0400 (0:00:00.164) 0:11:21.332 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:33:47 -0400 (0:00:01.764) 0:11:23.097 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { 
"name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": 
"pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, 
"sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:33:49 -0400 (0:00:02.792) 0:11:25.889 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:33:50 -0400 (0:00:00.565) 0:11:26.455 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "/tmp/storage_test17bktr_slukskey", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "absent" }, { "dump": 0, 
"fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:34:02 -0400 (0:00:11.937) 0:11:38.392 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:34:02 -0400 (0:00:00.277) 0:11:38.669 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382310.3137774, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3e88b683444eb90ffa9d3b8774aa392b47768a9a", "ctime": 1776382310.3107774, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382310.3107774, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1478, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, 
"xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:34:03 -0400 (0:00:01.183) 0:11:39.852 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:34:05 -0400 (0:00:01.308) 0:11:41.161 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:34:05 -0400 (0:00:00.523) 0:11:41.684 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "/tmp/storage_test17bktr_slukskey", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:34:05 -0400 (0:00:00.315) 0:11:42.000 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:34:06 -0400 (0:00:00.334) 0:11:42.334 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:34:06 -0400 (0:00:00.330) 0:11:42.664 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': 'UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", 
"fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f2d391a2-eab0-4f7c-a3f7-e4a768f4fe29" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:34:08 -0400 (0:00:01.535) 0:11:44.199 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:34:09 -0400 (0:00:01.583) 0:11:45.783 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:34:11 -0400 (0:00:01.529) 0:11:47.313 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:34:11 -0400 (0:00:00.421) 0:11:47.734 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:34:13 -0400 (0:00:01.681) 0:11:49.416 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 
1776382322.9297805, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382315.1457787, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 92274895, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776382315.1457875, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "3109490747", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:34:14 -0400 (0:00:01.037) 0:11:50.454 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda1', 'name': 'luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', 'password': '/tmp/storage_test17bktr_slukskey', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "/tmp/storage_test17bktr_slukskey", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:34:15 -0400 (0:00:01.394) 0:11:51.849 ******** ok: [managed-node13] TASK [Verify role results - 6] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:336 Thursday 16 April 2026 19:34:17 -0400 (0:00:02.089) 0:11:53.938 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:34:18 -0400 (0:00:00.394) 0:11:54.332 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", 
"encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:34:18 -0400 (0:00:00.361) 0:11:54.694 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:34:18 -0400 (0:00:00.210) 0:11:54.904 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "size": "4G", "type": "crypt", "uuid": "3d332855-c3d3-4a7b-be74-ad2c8572f43c" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "4G", "type": "partition", "uuid": "691f6e5f-6f09-4246-a8c2-4a5b014208bb" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:34:20 -0400 (0:00:01.177) 0:11:56.081 ******** ok: 
[managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003051", "end": "2026-04-16 19:34:21.162845", "rc": 0, "start": "2026-04-16 19:34:21.159794" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. # UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:34:21 -0400 (0:00:01.326) 0:11:57.408 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003585", "end": "2026-04-16 19:34:22.499942", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:34:22.496357" } STDOUT: luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb /dev/sda1 /tmp/storage_test17bktr_slukskey TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:34:22 -0400 (0:00:01.300) 0:11:58.708 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Thursday 16 April 2026 19:34:22 -0400 (0:00:01.300) 0:11:58.708 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'partition', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}]})
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Thursday 16 April 2026 19:34:23 -0400 (0:00:00.462) 0:11:59.171 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Thursday 16 April 2026 19:34:23 -0400 (0:00:00.197) 0:11:59.369 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Thursday 16 April 2026 19:34:23 -0400 (0:00:00.234) 0:11:59.603 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.state == 'present'", "skip_reason": "Conditional result was False" }
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Thursday 16 April 2026 19:34:23 -0400 (0:00:00.269) 0:11:59.872 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes)
TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Thursday 16 April 2026 19:34:24 -0400 (0:00:00.706) 0:12:00.579 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Thursday 16 April 2026 19:34:24 -0400 (0:00:00.237) 0:12:00.816 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }
TASK [Set pvs lvm length]
****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:34:25 -0400 (0:00:00.220) 0:12:01.037 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Thursday 16 April 2026 19:34:25 -0400 (0:00:00.211) 0:12:01.249 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:34:25 -0400 (0:00:00.231) 0:12:01.480 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:34:25 -0400 (0:00:00.270) 0:12:01.750 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:34:25 -0400 (0:00:00.226) 0:12:01.977 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and not storage_test_pool.encryption", "skip_reason": "Conditional result was False" } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:34:26 -0400 (0:00:00.151) 0:12:02.129 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:34:26 -0400 (0:00:00.234) 0:12:02.363 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:34:26 -0400 (0:00:00.161) 0:12:02.525 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:99559): WARNING **: 19:34:27.392: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:34:27 -0400 (0:00:01.116) 0:12:03.641 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:34:27 -0400 (0:00:00.212) 0:12:03.853 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:34:28 -0400 (0:00:00.764) 0:12:04.618 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:34:28 -0400 (0:00:00.217) 0:12:04.836 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:34:29 -0400 (0:00:00.222) 0:12:05.076 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:34:29 -0400 (0:00:00.266) 0:12:05.343 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:34:29 -0400 (0:00:00.229) 0:12:05.572 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:34:29 -0400 (0:00:00.269) 0:12:05.841 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:34:30 -0400 (0:00:00.186) 0:12:06.028 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:34:30 -0400 (0:00:00.183) 0:12:06.211 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:34:30 -0400 (0:00:00.229) 0:12:06.441 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:34:30 -0400 (0:00:00.172) 0:12:06.614 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:34:30 -0400 (0:00:00.306) 0:12:06.920 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:34:31 -0400 (0:00:00.294) 0:12:07.215 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for 
managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:34:31 -0400 (0:00:00.628) 0:12:07.843 ******** skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:34:32 -0400 (0:00:00.378) 0:12:08.221 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:34:32 -0400 (0:00:00.589) 0:12:08.811 ******** skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check member encryption] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Thursday 16 April 2026 19:34:33 -0400 (0:00:00.298) 0:12:09.109 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Thursday 16 April 2026 19:34:33 
-0400 (0:00:00.693) 0:12:09.803 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Thursday 16 April 2026 19:34:34 -0400 (0:00:00.371) 0:12:10.175 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Thursday 16 April 2026 19:34:34 -0400 (0:00:00.214) 0:12:10.389 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Thursday 16 April 2026 19:34:34 -0400 (0:00:00.221) 0:12:10.611 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Thursday 16 April 2026 19:34:34 -0400 (0:00:00.248) 0:12:10.860 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Thursday 16 April 2026 19:34:35 -0400 (0:00:00.794) 0:12:11.654 ******** skipping: [managed-node13] => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "false_condition": "storage_test_pool.type == 'lvm'", "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": 
"/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "/tmp/storage_test17bktr_slukskey", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Thursday 16 April 2026 19:34:36 -0400 (0:00:00.398) 0:12:12.053 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Thursday 16 April 2026 19:34:36 -0400 (0:00:00.871) 0:12:12.925 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Thursday 16 April 2026 19:34:37 -0400 (0:00:00.268) 0:12:13.193 ******** skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Thursday 16 April 2026 19:34:37 -0400 (0:00:00.259) 0:12:13.453 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Thursday 16 April 2026 19:34:37 -0400 (0:00:00.193) 0:12:13.646 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Thursday 16 April 2026 19:34:37 
TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Thursday 16 April 2026 19:34:37 -0400 (0:00:00.205) 0:12:13.852 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Thursday 16 April 2026 19:34:38 -0400 (0:00:00.283) 0:12:14.135 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }
TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Thursday 16 April 2026 19:34:38 -0400 (0:00:00.216) 0:12:14.351 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }
TASK [Clean up test variables] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Thursday 16 April 2026 19:34:38 -0400 (0:00:00.220) 0:12:14.572 ********
ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }
TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Thursday 16 April 2026 19:34:38 -0400 (0:00:00.321) 0:12:14.893 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': '/tmp/storage_test17bktr_slukskey', 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'partition', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_raw_device': '/dev/sda1', '_mount_id': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/sda1'})
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Thursday 16 April 2026 19:34:39 -0400 (0:00:00.321) 0:12:15.214 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device",
"encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:34:39 -0400 (0:00:00.350) 0:12:15.565 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:34:41 -0400 (0:00:01.827) 0:12:17.392 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:34:41 -0400 (0:00:00.369) 0:12:17.761 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:34:42 -0400 (0:00:00.364) 0:12:18.126 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:34:42 -0400 (0:00:00.451) 0:12:18.579 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] 
********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:34:44 -0400 (0:00:01.484) 0:12:20.064 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:34:44 -0400 (0:00:00.285) 0:12:20.350 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:34:44 -0400 (0:00:00.283) 0:12:20.633 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:34:45 -0400 (0:00:00.392) 0:12:21.025 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:34:45 -0400 (0:00:00.236) 0:12:21.262 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:34:45 -0400 (0:00:00.227) 0:12:21.489 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:34:45 -0400 (0:00:00.206) 0:12:21.696 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:34:45 -0400 (0:00:00.266) 0:12:21.962 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", 
"storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:34:46 -0400 (0:00:00.745) 0:12:22.730 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:34:47 -0400 (0:00:00.336) 0:12:23.067 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:34:47 -0400 (0:00:00.248) 0:12:23.316 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:34:47 -0400 (0:00:00.213) 0:12:23.529 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:34:47 -0400 (0:00:00.315) 0:12:23.845 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:34:48 -0400 (0:00:00.234) 0:12:24.079 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:34:48 -0400 (0:00:00.357) 0:12:24.436 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:34:48 -0400 (0:00:00.304) 0:12:24.741 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 
1776382441.8568082, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382441.8568082, "dev": 6, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1641, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776382441.8568082, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:34:49 -0400 (0:00:01.273) 0:12:26.014 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:34:50 -0400 (0:00:00.361) 0:12:26.376 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:34:50 -0400 (0:00:00.247) 0:12:26.667 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:34:50 -0400 (0:00:00.270) 0:12:26.938 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:34:51 -0400 (0:00:00.269) 0:12:27.207 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:34:51 -0400 (0:00:00.180) 0:12:27.388 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:34:51 -0400 (0:00:00.345) 0:12:27.733 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382442.1018083, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382442.1018083, 
"dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1681, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382442.1018083, "nlink": 1, "path": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:34:53 -0400 (0:00:01.420) 0:12:29.153 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:34:55 -0400 (0:00:01.975) 0:12:31.129 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.006904", "end": "2026-04-16 19:34:56.112485", "rc": 0, "start": "2026-04-16 19:34:56.105581" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 691f6e5f-6f09-4246-a8c2-4a5b014208bb Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 689848 Threads: 2 Salt: 1a c6 0f 71 b1 7a 4b c9 d3 b6 5e b8 c2 95 59 a3 dc 76 65 f5 9a 01 09 d3 26 51 26 74 49 2c df c1 AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131863 Salt: 0d 99 71 2f 2f b4 fb 49 c6 d9 f8 0e 86 77 0b 0e 05 9d a3 f0 6b ff a0 2d fc 4c 68 88 66 7b c1 da Digest: 50 ea a5 e1 73 4e c5 43 ee 93 51 9b 29 28 99 89 4d c7 d8 5a 7a be 0e ed d3 e5 60 c4 6a 79 46 40 TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:34:56 -0400 (0:00:01.161) 0:12:32.291 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:34:56 -0400 (0:00:00.276) 0:12:32.568 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:34:56 -0400 (0:00:00.372) 0:12:32.940 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device 
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Thursday 16 April 2026 19:34:56 -0400 (0:00:01.161) 0:12:32.291 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Thursday 16 April 2026 19:34:56 -0400 (0:00:00.276) 0:12:32.568 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Thursday 16 April 2026 19:34:56 -0400 (0:00:00.372) 0:12:32.940 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Thursday 16 April 2026 19:34:57 -0400 (0:00:00.387) 0:12:33.328 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Thursday 16 April 2026 19:34:57 -0400 (0:00:00.445) 0:12:33.773 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_luks_version is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Thursday 16 April 2026 19:34:58 -0400 (0:00:00.396) 0:12:34.169 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Thursday 16 April 2026 19:34:58 -0400 (0:00:00.346) 0:12:34.516 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Thursday 16 April 2026 19:34:58 -0400 (0:00:00.575) 0:12:34.891 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb /dev/sda1 /tmp/storage_test17bktr_slukskey" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "/tmp/storage_test17bktr_slukskey" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Thursday 16 April 2026 19:34:59 -0400 (0:00:00.405) 0:12:35.467 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Thursday 16 April 2026 19:34:59 -0400 (0:00:00.422) 0:12:35.873 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Thursday 16 April 2026 19:35:00 -0400 (0:00:00.523) 0:12:36.295 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
TASK [Check key file of crypttab entry] ****************************************
task path:
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:35:00 -0400 (0:00:00.523) 0:12:36.819 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:35:01 -0400 (0:00:00.311) 0:12:37.130 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:35:01 -0400 (0:00:00.234) 0:12:37.365 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:35:01 -0400 (0:00:00.262) 0:12:37.627 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:35:01 -0400 (0:00:00.264) 0:12:37.892 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:35:02 -0400 (0:00:00.206) 0:12:38.099 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:35:02 -0400 (0:00:00.207) 0:12:38.306 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:35:02 -0400 (0:00:00.186) 0:12:38.493 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 
2026 19:35:02 -0400 (0:00:00.198) 0:12:38.711 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:35:02 -0400 (0:00:00.193) 0:12:38.905 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:35:03 -0400 (0:00:00.212) 0:12:39.117 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:35:03 -0400 (0:00:00.282) 0:12:39.400 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:35:03 -0400 (0:00:00.280) 0:12:39.680 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:35:03 -0400 (0:00:00.215) 0:12:39.895 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:35:04 -0400 (0:00:00.219) 0:12:40.115 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:35:04 -0400 (0:00:00.271) 0:12:40.387 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:35:04 -0400 (0:00:00.258) 0:12:40.646 ******** skipping: [managed-node13] => { "changed": false, 
"false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:35:04 -0400 (0:00:00.252) 0:12:40.898 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:35:05 -0400 (0:00:00.200) 0:12:41.099 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:35:05 -0400 (0:00:00.250) 0:12:41.349 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.type == \"lvm\"" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:35:05 -0400 (0:00:00.363) 0:12:41.712 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:35:05 -0400 (0:00:00.235) 0:12:41.948 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:35:06 -0400 (0:00:00.192) 0:12:42.140 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:35:06 -0400 (0:00:00.288) 0:12:42.428 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:35:06 -0400 (0:00:00.443) 0:12:42.872 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:35:07 -0400 (0:00:00.401) 0:12:43.274 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:35:07 -0400 (0:00:00.252) 0:12:43.526 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:35:07 -0400 (0:00:00.273) 0:12:43.800 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:35:08 -0400 (0:00:00.294) 0:12:44.094 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:35:08 -0400 (0:00:00.318) 0:12:44.412 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:35:08 -0400 (0:00:00.294) 0:12:44.707 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:35:09 -0400 (0:00:00.430) 0:12:45.138 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:35:09 -0400 (0:00:00.337) 0:12:45.475 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:35:09 -0400 (0:00:00.278) 0:12:45.753 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": 
"Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:35:10 -0400 (0:00:00.342) 0:12:46.096 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:35:10 -0400 (0:00:00.285) 0:12:46.381 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:35:10 -0400 (0:00:00.419) 0:12:46.801 ******** ok: [managed-node13] => { "storage_test_actual_size": { "changed": false, "false_condition": "storage_test_volume.type not in ['partition', 'disk']", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:35:11 -0400 (0:00:00.269) 0:12:47.070 ******** ok: [managed-node13] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:35:11 -0400 (0:00:00.351) 0:12:47.423 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"lvm\"", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:35:11 -0400 (0:00:00.281) 0:12:47.704 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:35:11 -0400 (0:00:00.288) 0:12:47.993 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:35:12 -0400 (0:00:00.256) 0:12:48.249 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and 
_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:35:12 -0400 (0:00:00.294) 0:12:48.548 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:35:12 -0400 (0:00:00.238) 0:12:48.786 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:35:12 -0400 (0:00:00.236) 0:12:49.023 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:35:13 -0400 (0:00:00.177) 0:12:49.200 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:35:13 -0400 (0:00:00.216) 0:12:49.417 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:35:13 -0400 (0:00:00.327) 0:12:49.744 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:35:14 -0400 (0:00:00.287) 0:12:50.032 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:342 Thursday 16 April 2026 19:35:14 -0400 (0:00:00.227) 0:12:50.260 ******** ok: [managed-node13] => { "changed": false, "path": "/tmp/storage_test17bktr_slukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key - 3] 
TASK [Test for correct handling of new encrypted volume w/ no key - 3] ********* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:352 Thursday 16 April 2026 19:35:15 -0400 (0:00:00.998) 0:12:51.258 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13
TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Thursday 16 April 2026 19:35:15 -0400 (0:00:00.413) 0:12:51.672 ******** ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }
TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Thursday 16 April 2026 19:35:15 -0400 (0:00:00.318) 0:12:51.990 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:35:16 -0400 (0:00:00.410) 0:12:52.401 ******** META: facts cleared
TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:35:16 -0400 (0:00:00.023) 0:12:52.425 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }
TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:35:16 -0400 (0:00:00.238) 0:12:52.663 ******** included: fedora.linux_system_roles.storage for managed-node13
TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:35:17 -0400 (0:00:00.528) 0:12:53.191 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:35:18 -0400 (0:00:01.606) 0:12:54.797 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:35:19 -0400 (0:00:00.302) 0:12:55.100 ******** ok: [managed-node13]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path:
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:35:21 -0400 (0:00:01.963) 0:12:57.063 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:35:21 -0400 (0:00:00.848) 0:12:57.911 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:35:22 -0400 (0:00:00.307) 0:12:58.219 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:35:22 -0400 (0:00:00.396) 0:12:58.616 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:35:22 -0400 (0:00:00.251) 0:12:58.867 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 
April 2026 19:35:23 -0400 (0:00:00.239) 0:12:59.107 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:35:23 -0400 (0:00:00.798) 0:12:59.905 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:35:24 -0400 (0:00:00.256) 0:13:00.162 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:35:24 -0400 (0:00:00.310) 0:13:00.472 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:35:26 -0400 (0:00:01.928) 0:13:02.401 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:35:26 -0400 (0:00:00.233) 0:13:02.634 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] }
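The storage_pools value echoed above is the whole point of this scenario: one LVM pool on sda containing a single 4g volume with encryption: true but, deliberately, no key or passphrase. A rough sketch of a playbook that would feed the role this same specification (the hosts/vars layout is assumed, not taken from the test itself):

- hosts: managed-node13
  tasks:
    - name: Attempt an encrypted LVM volume with no key material
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                # deliberately no encryption_password / encryption_key

Because nothing supplies the LUKS secret, the blivet provider is expected to refuse the request, which is exactly what the "Manage the pools and volumes" task does further down.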
"skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:35:29 -0400 (0:00:00.222) 0:13:06.011 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:35:30 -0400 (0:00:00.280) 0:13:06.292 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:35:30 -0400 (0:00:00.241) 0:13:06.533 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:35:32 -0400 (0:00:01.970) 0:13:08.504 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": 
"stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": 
"static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": 
"systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:35:35 -0400 (0:00:02.869) 0:13:11.374 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:35:35 -0400 (0:00:00.363) 0:13:11.737 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:35:38 -0400 (0:00:02.434) 0:13:14.171 ******** fatal: [managed-node13]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "encrypted volume 'test1' missing key/password", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': False, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:35:38 -0400 (0:00:00.351) 0:13:14.523 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Check that we failed in the role] 
**************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:35:39 -0400 (0:00:00.630) 0:13:15.154 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:35:39 -0400 (0:00:00.283) 0:13:15.437 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:35:39 -0400 (0:00:00.391) 0:13:15.829 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:370 Thursday 16 April 2026 19:35:40 -0400 (0:00:00.260) 0:13:16.090 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:35:40 -0400 (0:00:00.369) 0:13:16.488 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:35:40 -0400 (0:00:00.023) 0:13:16.511 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:35:40 -0400 (0:00:00.257) 0:13:16.768 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:35:41 -0400 (0:00:00.384) 0:13:17.153 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:35:42 -0400 (0:00:01.171) 0:13:18.325 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:35:42 -0400 (0:00:00.171) 0:13:18.497 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:35:44 -0400 (0:00:01.961) 0:13:20.458 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:35:45 -0400 (0:00:00.725) 0:13:21.184 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:35:45 -0400 (0:00:00.249) 0:13:21.433 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:35:45 -0400 (0:00:00.291) 0:13:21.724 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:35:45 -0400 (0:00:00.247) 0:13:21.972 ******** ok: 
[managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:35:46 -0400 (0:00:00.309) 0:13:22.282 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:35:47 -0400 (0:00:00.867) 0:13:23.150 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:35:47 -0400 (0:00:00.231) 0:13:23.381 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:35:47 -0400 (0:00:00.258) 0:13:23.640 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:35:49 -0400 (0:00:02.036) 0:13:25.676 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:35:50 -0400 (0:00:00.352) 0:13:26.029 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:35:50 -0400 (0:00:00.271) 0:13:26.300 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:35:52 -0400 (0:00:02.351) 0:13:28.652 ******** included: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:35:53 -0400 (0:00:00.434) 0:13:29.087 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:35:53 -0400 (0:00:00.148) 0:13:29.235 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:35:53 -0400 (0:00:00.188) 0:13:29.424 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:35:53 -0400 (0:00:00.146) 0:13:29.570 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:35:55 -0400 (0:00:01.741) 0:13:31.312 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": 
"chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { 
"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": 
"systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:35:58 -0400 (0:00:02.841) 0:13:34.153 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:35:58 -0400 (0:00:00.430) 0:13:34.583 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy 
format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:36:09 -0400 (0:00:10.728) 0:13:45.312 ******** skipping: [managed-node13] => { 
"changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:36:09 -0400 (0:00:00.151) 0:13:45.464 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382451.1088104, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a30f4e2b3cae7865f2a49814b9b84e18c0f904c9", "ctime": 1776382451.1048105, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382451.1048105, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:36:10 -0400 (0:00:01.161) 0:13:46.625 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:36:11 -0400 (0:00:01.294) 0:13:47.920 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:36:12 -0400 (0:00:00.541) 0:13:48.462 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", 
"password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" } ], "packages": [ "xfsprogs", "cryptsetup", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:36:12 -0400 (0:00:00.368) 0:13:48.830 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": 
null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:36:13 -0400 (0:00:00.299) 0:13:49.130 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:36:13 -0400 (0:00:00.322) 0:13:49.453 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:36:15 -0400 (0:00:01.620) 0:13:51.073 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:36:16 -0400 (0:00:01.604) 0:13:52.678 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, 
"mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:36:17 -0400 (0:00:01.235) 0:13:53.914 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:36:18 -0400 (0:00:00.436) 0:13:54.350 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:36:19 -0400 (0:00:01.618) 0:13:55.969 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382462.497813, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "71e48526c1aa40869ce88b62d99899f2a28590f3", "ctime": 1776382455.6618114, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 390070470, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776382455.661663, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 85, "uid": 0, "version": "2474197557", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:36:21 -0400 (0:00:01.165) 0:13:57.134 ******** changed: [managed-node13] => (item={'backing_device': '/dev/sda1', 'name': 'luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node13] => 
(item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:36:23 -0400 (0:00:02.453) 0:13:59.588 ******** ok: [managed-node13] TASK [Verify role results - 7] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:388 Thursday 16 April 2026 19:36:25 -0400 (0:00:01.897) 0:14:01.486 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:36:26 -0400 (0:00:00.562) 0:14:02.048 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:36:26 -0400 (0:00:00.367) 0:14:02.416 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:36:26 -0400 (0:00:00.287) 0:14:02.703 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1edafecf-4d8e-4f78-aff4-ede6c0f23479" }, "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "size": "4G", "type": "crypt", "uuid": "120ce95f-1f7e-4203-a9a7-bd6636b58678" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:36:28 -0400 (0:00:01.388) 0:14:04.092 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003007", "end": "2026-04-16 19:36:29.282958", "rc": 0, "start": "2026-04-16 19:36:29.279951" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:36:29 -0400 (0:00:01.333) 0:14:05.426 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003199", "end": "2026-04-16 19:36:30.509141", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:36:30.505942" } STDOUT: luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 -
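The fstab and crypttab lines above are the state the role was asked to reach: an LVM pool foo on sda carrying a 4 GiB xfs volume test1, encrypted with LUKS1 using aes-xts-plain64 and a 512-bit key, unlocked as /dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 and mounted at /opt/test1. A minimal playbook sketch that requests the same state — every value is taken from the pool facts printed earlier in this log, except luks_password, which is a placeholder for the real password masked as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER:

- hosts: managed-node13
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks1
                encryption_cipher: aes-xts-plain64
                encryption_key_size: 512
                encryption_password: "{{ luks_password }}"  # placeholder for the masked test password

TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:36:30 -0400 (0:00:01.288) 0:14:06.714 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None,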
'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Thursday 16 April 2026 19:36:31 -0400 (0:00:00.524) 0:14:07.239 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Thursday 16 April 2026 19:36:31 -0400 (0:00:00.212) 0:14:07.451 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029070", "end": "2026-04-16 19:36:32.647874", "rc": 0, "start": "2026-04-16 19:36:32.618804" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Thursday 16 April 2026 19:36:32 -0400 (0:00:01.432) 0:14:08.884 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
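With --binary, vgs prints 1 when the volume group uses lvmlockd-style sharing and 0 otherwise, so the 0 above confirms the pool's shared: false setting. A minimal sketch of that comparison, reusing the command and variable names from the log (the test's own assertion may be phrased differently):

- name: Get VG shared value status
  ansible.builtin.command: vgs --noheadings --binary -o shared foo
  register: vgs_shared
  changed_when: false

- name: Verify that VG shared value checks out
  ansible.builtin.assert:
    that:
      # "1" from vgs must coincide with shared: true in the pool spec
      - (vgs_shared.stdout | trim == '1') == (storage_test_pool.shared | bool)

TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Thursday 16 April 2026 19:36:34 -0400 (0:00:01.612) 0:14:10.497 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Thursday 16 April 2026 19:36:35 -0400 (0:00:00.650) 0:14:11.147 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Thursday 16 April 2026 19:36:35 -0400 (0:00:00.514) 0:14:11.662 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:36:38 -0400 (0:00:03.186) 0:14:14.848 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Thursday 16 April 2026 19:36:39 -0400 (0:00:00.308)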
0:14:15.157 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:36:39 -0400 (0:00:00.407) 0:14:15.565 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:36:39 -0400 (0:00:00.362) 0:14:15.927 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:36:40 -0400 (0:00:00.280) 0:14:16.208 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:36:40 -0400 (0:00:00.326) 0:14:16.535 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:36:40 -0400 (0:00:00.382) 0:14:16.917 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:36:41 -0400 (0:00:00.453) 0:14:17.371 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:105611): WARNING **: 19:36:42.484: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:36:42 -0400 (0:00:01.426) 0:14:18.797 ******** skipping: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:36:43 -0400 (0:00:00.400) 0:14:19.198 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:36:43 -0400 (0:00:00.675) 0:14:19.873 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:36:44 -0400 (0:00:00.254) 0:14:20.128 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:36:44 -0400 (0:00:00.254) 0:14:20.382 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:36:44 -0400 (0:00:00.232) 0:14:20.615 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:36:44 -0400 (0:00:00.227) 0:14:20.843 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:36:45 -0400 (0:00:00.260) 0:14:21.103 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:36:45 -0400 (0:00:00.232) 0:14:21.335 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:36:45 -0400 (0:00:00.232) 0:14:21.568 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:36:45 -0400 (0:00:00.277) 0:14:21.846 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:36:46 -0400 (0:00:00.249) 0:14:22.096 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:36:46 -0400 (0:00:00.212) 0:14:22.308 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:36:46 -0400 (0:00:00.277) 0:14:22.585 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:36:47 -0400 (0:00:00.662) 0:14:23.248 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 
'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 16 April 2026 19:36:47 -0400 (0:00:00.623) 0:14:23.872 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Thursday 16 April 2026 19:36:48 -0400 (0:00:00.347) 0:14:24.219 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Thursday 16 April 2026 19:36:48 -0400 (0:00:00.380) 0:14:24.599 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Thursday 16 April 2026 19:36:48 -0400 (0:00:00.289) 0:14:24.889 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Thursday 16 April 2026 19:36:49 -0400 (0:00:00.313) 0:14:25.202 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Thursday 16 April 2026 
19:36:49 -0400 (0:00:00.313) 0:14:25.516 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Thursday 16 April 2026 19:36:49 -0400 (0:00:00.282) 0:14:25.799 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:36:50 -0400 (0:00:00.306) 0:14:26.105 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:36:50 -0400 (0:00:00.667) 0:14:26.772 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Thursday 16 April 2026 19:36:51 -0400 (0:00:00.622) 0:14:27.394 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Thursday 16 April 2026 19:36:51 -0400 (0:00:00.159) 0:14:27.554 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": 
"Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Thursday 16 April 2026 19:36:51 -0400 (0:00:00.183) 0:14:27.738 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Thursday 16 April 2026 19:36:51 -0400 (0:00:00.220) 0:14:27.959 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Thursday 16 April 2026 19:36:52 -0400 (0:00:00.261) 0:14:28.220 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Thursday 16 April 2026 19:36:52 -0400 (0:00:00.740) 0:14:28.961 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Thursday 16 April 2026 19:36:53 -0400 (0:00:00.361) 0:14:29.322 ******** skipping: [managed-node13] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Thursday 16 April 2026 19:36:53 -0400 (0:00:00.211) 0:14:29.534 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node13 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Thursday 16 April 2026 19:36:53 -0400 (0:00:00.479) 0:14:30.013 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Thursday 16 April 2026 19:36:54 -0400 (0:00:00.336) 0:14:30.354 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK 
[Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Thursday 16 April 2026 19:36:54 -0400 (0:00:00.436) 0:14:30.790 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Thursday 16 April 2026 19:36:55 -0400 (0:00:00.248) 0:14:31.038 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Thursday 16 April 2026 19:36:55 -0400 (0:00:00.266) 0:14:31.304 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Thursday 16 April 2026 19:36:55 -0400 (0:00:00.370) 0:14:31.675 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Thursday 16 April 2026 19:36:55 -0400 (0:00:00.257) 0:14:31.933 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Thursday 16 April 2026 19:36:56 -0400 (0:00:00.265) 0:14:32.198 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Thursday 16 April 2026 19:36:56 -0400 (0:00:00.819) 0:14:33.018 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 
'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Thursday 16 April 2026 19:36:57 -0400 (0:00:00.706) 0:14:33.725 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Thursday 16 April 2026 19:36:57 -0400 (0:00:00.202) 0:14:33.927 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Thursday 16 April 2026 19:36:58 -0400 (0:00:00.240) 0:14:34.168 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Thursday 16 April 2026 19:36:58 -0400 (0:00:00.262) 0:14:34.431 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Thursday 16 April 2026 19:36:58 -0400 (0:00:00.250) 0:14:34.681 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Thursday 16 April 2026 19:36:58 -0400 (0:00:00.234) 0:14:34.916 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression 
!= none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Thursday 16 April 2026 19:36:59 -0400 (0:00:00.254) 0:14:35.170 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Thursday 16 April 2026 19:36:59 -0400 (0:00:00.270) 0:14:35.441 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Thursday 16 April 2026 19:37:00 -0400 (0:00:00.808) 0:14:36.249 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Thursday 16 April 2026 19:37:00 -0400 (0:00:00.253) 0:14:36.502 ******** skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Thursday 16 April 2026 19:37:00 -0400 (0:00:00.239) 0:14:36.742 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Thursday 16 April 2026 19:37:00 -0400 (0:00:00.190) 0:14:36.933 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Thursday 16 April 2026 19:37:01 -0400 (0:00:00.221) 0:14:37.154 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Thursday 16 April 2026 19:37:01 -0400 (0:00:00.223) 0:14:37.378 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 
Thursday 16 April 2026 19:37:01 -0400 (0:00:00.307) 0:14:37.686 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Thursday 16 April 2026 19:37:01 -0400 (0:00:00.238) 0:14:37.924 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Thursday 16 April 2026 19:37:02 -0400 (0:00:00.278) 0:14:38.203 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': 'aes-xts-plain64', 'encryption_key': None, 'encryption_key_size': 512, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': [], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:37:02 -0400 (0:00:00.614) 0:14:38.817 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:37:03 -0400 (0:00:00.408) 0:14:39.226 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:37:05 -0400 (0:00:02.348) 0:14:41.575 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:37:05 -0400 (0:00:00.396) 0:14:41.971 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:37:06 -0400 (0:00:00.476) 0:14:42.448 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:37:07 -0400 (0:00:00.597) 0:14:43.046 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:37:07 -0400 (0:00:00.435) 0:14:43.481 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:37:07 -0400 (0:00:00.428) 0:14:43.910 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was 
False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:37:08 -0400 (0:00:00.369) 0:14:44.280 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:37:08 -0400 (0:00:00.507) 0:14:44.787 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:37:09 -0400 (0:00:00.256) 0:14:45.044 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:37:09 -0400 (0:00:00.243) 0:14:45.287 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:37:09 -0400 (0:00:00.203) 0:14:45.490 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:37:09 -0400 (0:00:00.256) 0:14:45.747 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:37:10 -0400 (0:00:00.787) 0:14:46.535 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:37:10 -0400 (0:00:00.314) 0:14:46.849 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:37:11 -0400 (0:00:00.247) 0:14:47.097 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:37:11 -0400 (0:00:00.295) 0:14:47.392 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:37:11 -0400 (0:00:00.420) 0:14:47.812 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:37:12 -0400 (0:00:00.236) 0:14:48.049 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:37:12 -0400 (0:00:00.471) 0:14:48.521 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:37:12 -0400 (0:00:00.454) 0:14:49.006 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382568.752838, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382568.752838, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1913, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382568.752838, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task 
TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Thursday 16 April 2026 19:37:14 -0400 (0:00:01.461) 0:14:50.467 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node - 2] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Thursday 16 April 2026 19:37:14 -0400 (0:00:00.365) 0:14:50.833 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Thursday 16 April 2026 19:37:15 -0400 (0:00:00.296) 0:14:51.129 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Thursday 16 April 2026 19:37:15 -0400 (0:00:00.424) 0:14:51.553 ********
ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Thursday 16 April 2026 19:37:15 -0400 (0:00:00.311) 0:14:51.865 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Thursday 16 April 2026 19:37:16 -0400 (0:00:00.220) 0:14:52.086 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Thursday 16 April 2026 19:37:16 -0400 (0:00:00.231) 0:14:52.317 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382568.990838, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382568.990838, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1961, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382568.990838, "nlink": 1, "path": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
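(The stat results above drive the presence checks that follow. A minimal sketch of that pairing, assuming a register named storage_test_dev, which is the editor's placeholder, not the test's variable name.)

- name: See whether the device node is present (illustrative sketch)
  ansible.builtin.stat:
    path: /dev/mapper/foo-test1
  register: storage_test_dev

- name: Verify the presence/absence of the device node (illustrative sketch)
  ansible.builtin.assert:
    that:
      # the log's stat shows exists=true and isblk=true for the mapper node
      - storage_test_dev.stat.exists
      - storage_test_dev.stat.isblk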
TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Thursday 16 April 2026 19:37:17 -0400 (0:00:01.390) 0:14:53.708 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Thursday 16 April 2026 19:37:19 -0400 (0:00:01.974) 0:14:55.683 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006873", "end": "2026-04-16 19:37:20.784174", "rc": 0, "start": "2026-04-16 19:37:20.777301" }
STDOUT:
LUKS header information for /dev/mapper/foo-test1
Version: 1
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 16384
MK bits: 512
MK digest: 68 99 ea 8d 5b 5e e9 da 9c 7d db 42 56 24 5d d5 78 4d 2c d1
MK salt: e3 9e 29 cf 53 6f 01 4d 79 14 e1 a8 41 f9 5e e2 81 97 d2 45 c5 13 25 e2 aa fb 0e ac b4 b0 4f 4a
MK iterations: 132129
UUID: 1edafecf-4d8e-4f78-aff4-ede6c0f23479
Key Slot 0: ENABLED
Iterations: 2111934
Salt: 71 1d f4 4e 82 b8 ef 6c 68 8c a7 6f c6 41 76 b8 4b 8c 5c a0 0e fb 8d e1 6a ff 1f 96 66 1a 31 d1
Key material offset: 8
AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Thursday 16 April 2026 19:37:20 -0400 (0:00:01.298) 0:14:56.981 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Thursday 16 April 2026 19:37:21 -0400 (0:00:00.352) 0:14:57.333 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Thursday 16 April 2026 19:37:21 -0400 (0:00:00.361) 0:14:57.695 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Thursday 16 April 2026 19:37:22 -0400 (0:00:00.367) 0:14:58.062 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Thursday 16 April 2026 19:37:22 -0400 (0:00:00.370) 0:14:58.433 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
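(The luksDump output above, with "Version: 1", cipher aes in xts-plain64 mode, and MK bits 512, is what the version, key-size, and cipher assertions consume. One way such a field could be extracted, as a sketch only; the registered name and the regex are the editor's assumptions.)

- name: Collect LUKS info for this volume (illustrative sketch)
  ansible.builtin.command:
    argv: [cryptsetup, luksDump, /dev/mapper/foo-test1]
  register: luks_dump
  changed_when: false

- name: Check LUKS version (illustrative sketch)
  ansible.builtin.assert:
    that:
      # pull the first capture group of 'Version:<spaces><digits>' from stdout
      - luks_dump.stdout | regex_search('Version:\s+(\d+)', '\1') | first == '1'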
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Thursday 16 April 2026 19:37:22 -0400 (0:00:00.448) 0:14:58.881 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Thursday 16 April 2026 19:37:23 -0400 (0:00:00.441) 0:14:59.323 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Thursday 16 April 2026 19:37:23 -0400 (0:00:00.523) 0:14:59.846 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Thursday 16 April 2026 19:37:24 -0400 (0:00:00.526) 0:15:00.373 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Thursday 16 April 2026 19:37:24 -0400 (0:00:00.349) 0:15:00.722 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Thursday 16 April 2026 19:37:25 -0400 (0:00:00.458) 0:15:01.181 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Thursday 16 April 2026 19:37:25 -0400 (0:00:00.382) 0:15:01.563 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Thursday 16 April 2026 19:37:25 -0400 (0:00:00.436) 0:15:02.000 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Thursday 16 April 2026 19:37:26 -0400 (0:00:00.234) 0:15:02.234 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
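(The crypttab fact above, "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 -", is a whitespace-separated name/device/keyfile triple; the format, backing-device, and key-file checks each inspect one field. A sketch of the key-file check under that assumption, not the test's verbatim source.)

- name: Check key file of crypttab entry (illustrative sketch)
  ansible.builtin.assert:
    that:
      # third field of the crypttab line is the key file; '-' means none
      - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file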
TASK [Set active devices regex] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Thursday 16 April 2026 19:37:26 -0400 (0:00:00.254) 0:15:02.489 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Thursday 16 April 2026 19:37:26 -0400 (0:00:00.165) 0:15:02.655 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Thursday 16 April 2026 19:37:26 -0400 (0:00:00.289) 0:15:02.944 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Thursday 16 April 2026 19:37:27 -0400 (0:00:00.245) 0:15:03.190 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Thursday 16 April 2026 19:37:27 -0400 (0:00:00.225) 0:15:03.417 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Thursday 16 April 2026 19:37:27 -0400 (0:00:00.239) 0:15:03.656 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Thursday 16 April 2026 19:37:27 -0400 (0:00:00.233) 0:15:03.889 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Thursday 16 April 2026 19:37:28 -0400 (0:00:00.223) 0:15:04.113 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }
"false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:37:28 -0400 (0:00:00.258) 0:15:04.651 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:37:32 -0400 (0:00:04.151) 0:15:08.802 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:37:34 -0400 (0:00:01.374) 0:15:10.176 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:37:34 -0400 (0:00:00.316) 0:15:10.493 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:37:34 -0400 (0:00:00.317) 0:15:10.810 ******** ok: [managed-node13] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:37:36 -0400 (0:00:01.234) 0:15:12.045 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:37:36 -0400 (0:00:00.275) 0:15:12.321 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:37:36 -0400 (0:00:00.329) 0:15:12.650 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:37:36 -0400 (0:00:00.356) 0:15:13.006 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional 
TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Thursday 16 April 2026 19:37:36 -0400 (0:00:00.356) 0:15:13.006 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Thursday 16 April 2026 19:37:37 -0400 (0:00:00.267) 0:15:13.274 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Thursday 16 April 2026 19:37:37 -0400 (0:00:00.299) 0:15:13.574 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Thursday 16 April 2026 19:37:37 -0400 (0:00:00.390) 0:15:13.965 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Thursday 16 April 2026 19:37:38 -0400 (0:00:00.262) 0:15:14.233 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Thursday 16 April 2026 19:37:38 -0400 (0:00:00.229) 0:15:14.463 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Thursday 16 April 2026 19:37:38 -0400 (0:00:00.319) 0:15:14.782 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Thursday 16 April 2026 19:37:39 -0400 (0:00:00.368) 0:15:15.151 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Thursday 16 April 2026 19:37:39 -0400 (0:00:00.217) 0:15:15.368 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }
TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Thursday 16 April 2026 19:37:39 -0400 (0:00:00.252) 0:15:15.621 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Thursday 16 April 2026 19:37:39 -0400 (0:00:00.331) 0:15:15.953 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Thursday 16 April 2026 19:37:40 -0400 (0:00:00.309) 0:15:16.263 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Thursday 16 April 2026 19:37:40 -0400 (0:00:00.318) 0:15:16.589 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Thursday 16 April 2026 19:37:40 -0400 (0:00:00.254) 0:15:16.843 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Thursday 16 April 2026 19:37:41 -0400 (0:00:00.335) 0:15:17.179 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Thursday 16 April 2026 19:37:41 -0400 (0:00:00.307) 0:15:17.486 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Thursday 16 April 2026 19:37:41 -0400 (0:00:00.425) 0:15:17.912 ********
ok: [managed-node13] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }
"storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:37:42 -0400 (0:00:00.298) 0:15:18.507 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:37:42 -0400 (0:00:00.430) 0:15:18.937 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.032323", "end": "2026-04-16 19:37:43.879054", "rc": 0, "start": "2026-04-16 19:37:43.846731" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:37:44 -0400 (0:00:01.125) 0:15:20.062 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:37:44 -0400 (0:00:00.386) 0:15:20.449 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:37:44 -0400 (0:00:00.298) 0:15:20.747 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:37:44 -0400 (0:00:00.234) 0:15:20.982 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:37:45 -0400 (0:00:00.287) 0:15:21.269 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:37:45 -0400 (0:00:00.223) 0:15:21.493 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional 
TASK [Check cache size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Thursday 16 April 2026 19:37:45 -0400 (0:00:00.223) 0:15:21.493 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Thursday 16 April 2026 19:37:45 -0400 (0:00:00.209) 0:15:21.703 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Thursday 16 April 2026 19:37:46 -0400 (0:00:00.357) 0:15:22.060 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Thursday 16 April 2026 19:37:47 -0400 (0:00:01.363) 0:15:23.423 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Verify preservation of encryption settings on existing LVM volume] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391
Thursday 16 April 2026 19:37:47 -0400 (0:00:00.282) 0:15:23.705 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13

TASK [Clear facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9
Thursday 16 April 2026 19:37:48 -0400 (0:00:00.493) 0:15:24.199 ********
META: facts cleared

TASK [Run the role] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23
Thursday 16 April 2026 19:37:48 -0400 (0:00:00.023) 0:15:24.223 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }

TASK [Run the role normally] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33
Thursday 16 April 2026 19:37:48 -0400 (0:00:00.254) 0:15:24.478 ********
included: fedora.linux_system_roles.storage for managed-node13

TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Thursday 16 April 2026 19:37:48 -0400 (0:00:00.442) 0:15:24.920 ********
ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7
Thursday 16 April 2026 19:37:50 -0400 (0:00:01.133) 0:15:26.053 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Thursday 16 April 2026 19:37:50 -0400 (0:00:00.259) 0:15:26.313 ********
ok: [managed-node13]

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Thursday 16 April 2026 19:37:52 -0400 (0:00:01.769) 0:15:28.082 ********
skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }
ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Thursday 16 April 2026 19:37:52 -0400 (0:00:00.631) 0:15:28.713 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Thursday 16 April 2026 19:37:52 -0400 (0:00:00.307) 0:15:29.021 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10
Thursday 16 April 2026 19:37:53 -0400 (0:00:00.203) 0:15:29.224 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14
Thursday 16 April 2026 19:37:53 -0400 (0:00:00.159) 0:15:29.384 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18
Thursday 16 April 2026 19:37:53 -0400 (0:00:00.295) 0:15:29.680 ********
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Add repo key] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Thursday 16 April 2026 19:37:54 -0400 (0:00:00.613) 0:15:30.294 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Add blivet repo] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15
Thursday 16 April 2026 19:37:54 -0400 (0:00:00.251) 0:15:30.545 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20
Thursday 16 April 2026 19:37:54 -0400 (0:00:00.206) 0:15:30.751 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27
Thursday 16 April 2026 19:37:56 -0400 (0:00:01.771) 0:15:32.523 ********
ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32
Thursday 16 April 2026 19:37:56 -0400 (0:00:00.298) 0:15:32.822 ********
ok: [managed-node13] => { "storage_volumes | d([])": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Thursday 16 April 2026 19:37:57 -0400 (0:00:00.243) 0:15:33.065 ********
ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] }
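(The storage_pools value printed above is the entire user-facing input for this pass of the role; everything after it is the role resolving that spec against the existing LUKS-backed volume. As a sketch, an equivalent direct invocation would look like the following; include_role is shown for illustration, while the test drives the role through its own helper tasks.)

- name: Run the role normally (illustrative sketch of the invocation shown above)
  ansible.builtin.include_role:
    name: fedora.linux_system_roles.storage
  vars:
    storage_pools:
      - name: foo
        type: lvm
        disks: [sda]
        volumes:
          - name: test1
            size: 4g
            mount_point: /opt/test1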
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50
Thursday 16 April 2026 19:37:59 -0400 (0:00:02.022) 0:15:35.087 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Thursday 16 April 2026 19:37:59 -0400 (0:00:00.321) 0:15:35.409 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Thursday 16 April 2026 19:37:59 -0400 (0:00:00.207) 0:15:35.617 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Thursday 16 April 2026 19:37:59 -0400 (0:00:00.269) 0:15:35.887 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56
Thursday 16 April 2026 19:38:00 -0400 (0:00:00.155) 0:15:36.043 ********
ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] }
MSG: Nothing to do

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76
Thursday 16 April 2026 19:38:01 -0400 (0:00:01.777) 0:15:37.821 ********
ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name":
"chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { 
"name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": 
"enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service": { "name": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": 
"systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:38:04 -0400 (0:00:02.826) 0:15:40.647 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d691f6e5f\x2d6f09\x2d4246\x2da8c2\x2d4a5b014208bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "name": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", 
"After": "dev-sda1.device cryptsetup-pre.target -.mount \"system-systemd\\\\x2dcryptsetup.slice\" systemd-journald.socket tmp.mount systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "\"blockdev@dev-mapper-luks\\\\x2d691f6e5f\\\\x2d6f09\\\\x2d4246\\\\x2da8c2\\\\x2d4a5b014208bb.target\" cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb /dev/sda1 /tmp/storage_test17bktr_slukskey ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb /dev/sda1 /tmp/storage_test17bktr_slukskey ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-691f6e5f-6f09-4246-a8c2-4a5b014208bb ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "FreezerState": "running", 
"GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d691f6e5f\\\\x2d6f09\\\\x2d4246\\\\x2da8c2\\\\x2d4a5b014208bb.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "-.mount \"system-systemd\\\\x2dcryptsetup.slice\"", "RequiresMountsFor": "/tmp/storage_test17bktr_slukskey", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", 
"RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:36:19 EDT", "StateChangeTimestampMonotonic": "2390141630", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d691f6e5f\\\\x2d6f09\\\\x2d4246\\\\x2da8c2\\\\x2d4a5b014208bb.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:38:06 -0400 (0:00:01.738) 0:15:42.386 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, 
"encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:38:08 -0400 (0:00:02.370) 0:15:44.756 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:38:08 -0400 (0:00:00.224) 0:15:44.981 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382577.73084, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "626483cc4af1733e5b9d55fb82a59a61ed6ea3de", "ctime": 1776382577.72684, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382577.72684, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:38:10 -0400 (0:00:01.096) 0:15:46.077 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "blivet_output is changed", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:38:10 -0400 (0:00:00.293) 0:15:46.371 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d691f6e5f\x2d6f09\x2d4246\x2da8c2\x2d4a5b014208bb.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "name": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", 
"AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": 
"0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d691f6e5f\\x2d6f09\\x2d4246\\x2da8c2\\x2d4a5b014208bb.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d691f6e5f\\\\x2d6f09\\\\x2d4246\\\\x2da8c2\\\\x2d4a5b014208bb.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK 
[fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:38:12 -0400 (0:00:02.093) 0:15:48.464 ******** ok: [managed-node13] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:38:12 -0400 (0:00:00.446) 0:15:48.910 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": 
"/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:38:13 -0400 (0:00:00.273) 0:15:49.184 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:38:13 -0400 (0:00:00.277) 0:15:49.461 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:38:13 -0400 (0:00:00.247) 0:15:49.709 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:38:15 -0400 (0:00:01.773) 0:15:51.482 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount ok: [managed-node13] => (item={'src': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 
16 April 2026 19:38:16 -0400 (0:00:01.413) 0:15:52.895 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:38:17 -0400 (0:00:00.396) 0:15:53.292 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:38:18 -0400 (0:00:01.707) 0:15:54.999 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382590.507843, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4b23f0039fa353d905d020ad1b0d9e45dc1129c7", "ctime": 1776382583.3958414, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 666894680, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776382583.39697, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3986624464", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:38:20 -0400 (0:00:01.338) 0:15:56.338 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:38:20 -0400 (0:00:00.175) 0:15:56.514 ******** ok: [managed-node13] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Thursday 16 April 2026 19:38:22 -0400 (0:00:01.941) 0:15:58.455 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify role results - 8] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:411 
Thursday 16 April 2026 19:38:22 -0400 (0:00:00.345) 0:15:58.800 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:38:23 -0400 (0:00:00.517) 0:15:59.318 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:38:23 -0400 (0:00:00.436) 0:15:59.755 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:38:23 -0400 (0:00:00.228) 0:15:59.983 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "1edafecf-4d8e-4f78-aff4-ede6c0f23479" }, "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "size": "4G", "type": "crypt", "uuid": "120ce95f-1f7e-4203-a9a7-bd6636b58678" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:38:25 -0400 (0:00:01.185) 0:16:01.169 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003085", "end": "2026-04-16 19:38:26.102921", "rc": 0, "start": "2026-04-16 19:38:26.099836" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 /dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:38:26 -0400 (0:00:01.128) 0:16:02.302 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002993", "end": "2026-04-16 19:38:27.188173", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:38:27.185180" } STDOUT: luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:38:27 -0400 (0:00:01.077) 0:16:03.380 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': 
'/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Thursday 16 April 2026 19:38:28 -0400 (0:00:00.656) 0:16:04.037 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Thursday 16 April 2026 19:38:28 -0400 (0:00:00.270) 0:16:04.307 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.029594", "end": "2026-04-16 19:38:29.307359", "rc": 0, "start": "2026-04-16 19:38:29.277765" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Thursday 16 April 2026 19:38:29 -0400 (0:00:01.189) 0:16:05.497 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Thursday 16 April 2026 19:38:29 -0400 (0:00:00.346) 0:16:05.843 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Thursday 16 April 2026 19:38:30 -0400 (0:00:00.610) 0:16:06.454 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Thursday 16 April 2026 19:38:31 -0400 (0:00:00.637) 0:16:07.092 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:38:32 -0400 (0:00:01.290) 0:16:08.382 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Thursday 16 April 2026 19:38:32 -0400 (0:00:00.367) 0:16:08.750 ******** ok: [managed-node13] 
=> { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:38:33 -0400 (0:00:00.431) 0:16:09.182 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:38:33 -0400 (0:00:00.385) 0:16:09.567 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:38:33 -0400 (0:00:00.258) 0:16:09.826 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:38:34 -0400 (0:00:00.392) 0:16:10.219 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:38:34 -0400 (0:00:00.317) 0:16:10.554 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:38:34 -0400 (0:00:00.386) 0:16:10.940 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:110475): WARNING **: 19:38:35.890: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:38:36 -0400 (0:00:01.247) 0:16:12.188 ******** skipping: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:38:36 -0400 (0:00:00.404) 0:16:12.592 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:38:37 -0400 (0:00:00.629) 0:16:13.222 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:38:37 -0400 (0:00:00.305) 0:16:13.527 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:38:37 -0400 (0:00:00.282) 0:16:13.810 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:38:38 -0400 (0:00:00.287) 0:16:14.098 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:38:38 -0400 (0:00:00.271) 0:16:14.369 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:38:38 -0400 (0:00:00.226) 0:16:14.596 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:38:38 -0400 (0:00:00.175) 0:16:14.771 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:38:38 -0400 (0:00:00.253) 0:16:15.024 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:38:39 -0400 (0:00:00.184) 0:16:15.209 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:38:39 -0400 (0:00:00.225) 0:16:15.435 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:38:39 -0400 (0:00:00.240) 0:16:15.675 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:38:39 -0400 (0:00:00.255) 0:16:15.930 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:38:40 -0400 (0:00:00.774) 0:16:16.705 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 
'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 16 April 2026 19:38:41 -0400 (0:00:00.571) 0:16:17.276 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Thursday 16 April 2026 19:38:41 -0400 (0:00:00.308) 0:16:17.585 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Thursday 16 April 2026 19:38:41 -0400 (0:00:00.348) 0:16:17.934 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Thursday 16 April 2026 19:38:42 -0400 (0:00:00.287) 0:16:18.222 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Thursday 16 April 2026 19:38:42 -0400 (0:00:00.264) 0:16:18.486 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Thursday 16 April 2026 19:38:42 -0400 (0:00:00.271) 0:16:18.758 ******** skipping: 
[managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Thursday 16 April 2026 19:38:43 -0400 (0:00:00.362) 0:16:19.120 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:38:43 -0400 (0:00:00.238) 0:16:19.358 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:38:43 -0400 (0:00:00.631) 0:16:19.990 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Thursday 16 April 2026 19:38:44 -0400 (0:00:00.425) 0:16:20.415 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Thursday 16 April 2026 19:38:44 -0400 (0:00:00.222) 0:16:20.638 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] 
****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22
Thursday 16 April 2026 19:38:44 -0400 (0:00:00.246) 0:16:20.885 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26
Thursday 16 April 2026 19:38:45 -0400 (0:00:00.285) 0:16:21.170 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false }

TASK [Check member encryption] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97
Thursday 16 April 2026 19:38:45 -0400 (0:00:00.290) 0:16:21.461 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13

TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5
Thursday 16 April 2026 19:38:46 -0400 (0:00:00.712) 0:16:22.174 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Validate pool member LUKS settings] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10
Thursday 16 April 2026 19:38:46 -0400 (0:00:00.393) 0:16:22.567 ********
skipping: [managed-node13] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" }
skipping: [managed-node13] => { "changed": false }
MSG: All items skipped

TASK [Validate pool member crypttab entries] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17
Thursday 16 April 2026 19:38:46 -0400 (0:00:00.209) 0:16:22.777 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node13 => (item=/dev/sda)

TASK [Set variables used by tests] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2
Thursday 16 April 2026 19:38:47 -0400 (0:00:00.521) 0:16:23.298 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6
Thursday 16 April 2026 19:38:47 -0400 (0:00:00.446) 0:16:23.745 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
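Note: pool-level encryption is disabled for this pool (the LUKS layer sits on the volume instead), so the expected crypttab entry count for the member /dev/sda is "0" and the assertion above passes on an empty match list. The shape of such a count-based check, sketched with illustrative task and variable names (not the suite's exact code):

  - name: Read /etc/crypttab
    ansible.builtin.slurp:
      src: /etc/crypttab
    register: crypttab

  - name: Check for /etc/crypttab entry  # member devices must not appear here
    ansible.builtin.assert:
      that: >-
        (crypttab.content | b64decode).splitlines()
        | select('search', '/dev/sda') | list | length == 0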
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14
Thursday 16 April 2026 19:38:48 -0400 (0:00:00.479) 0:16:24.224 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23
Thursday 16 April 2026 19:38:48 -0400 (0:00:00.265) 0:16:24.490 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32
Thursday 16 April 2026 19:38:48 -0400 (0:00:00.221) 0:16:24.712 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41
Thursday 16 April 2026 19:38:48 -0400 (0:00:00.233) 0:16:24.946 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24
Thursday 16 April 2026 19:38:49 -0400 (0:00:00.251) 0:16:25.197 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }

TASK [Check VDO] ***************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Thursday 16 April 2026 19:38:49 -0400 (0:00:00.304) 0:16:25.501 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Thursday 16 April 2026 19:38:50 -0400 (0:00:00.886) 0:16:26.388 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None,
'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Thursday 16 April 2026 19:38:50 -0400 (0:00:00.546) 0:16:26.935 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Thursday 16 April 2026 19:38:51 -0400 (0:00:00.215) 0:16:27.150 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Thursday 16 April 2026 19:38:51 -0400 (0:00:00.279) 0:16:27.430 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Thursday 16 April 2026 19:38:51 -0400 (0:00:00.242) 0:16:27.672 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Thursday 16 April 2026 19:38:51 -0400 (0:00:00.305) 0:16:27.978 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Thursday 16 April 2026 19:38:52 -0400 (0:00:00.203) 0:16:28.181 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Clear test variables] 
**************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Thursday 16 April 2026 19:38:52 -0400 (0:00:00.292) 0:16:28.474 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Thursday 16 April 2026 19:38:52 -0400 (0:00:00.225) 0:16:28.700 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Thursday 16 April 2026 19:38:53 -0400 (0:00:00.944) 0:16:29.645 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Thursday 16 April 2026 19:38:53 -0400 (0:00:00.234) 0:16:29.879 ********
skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Thursday 16 April 2026 19:38:54 -0400 (0:00:00.207) 0:16:30.087 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Thursday 16 April 2026 19:38:54 -0400 (0:00:00.190) 0:16:30.277 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Thursday 16 April 2026 19:38:54 -0400 (0:00:00.234) 0:16:30.512 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Thursday 16 April 2026 19:38:54 -0400 (0:00:00.263) 0:16:30.775 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Thursday 16 April 2026 19:38:55 -0400 (0:00:00.258) 0:16:31.034 ********
ok:
[managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }

TASK [Clean up test variables] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Thursday 16 April 2026 19:38:55 -0400 (0:00:00.245) 0:16:31.280 ********
ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Thursday 16 April 2026 19:38:55 -0400 (0:00:00.266) 0:16:31.547 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Thursday 16 April 2026 19:38:56 -0400 (0:00:00.486) 0:16:32.042 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
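Note: the volume item above also documents how blivet stacks the encrypted volume: '_raw_device' is the plain LV node (/dev/mapper/foo-test1, kernel /dev/dm-0), '_device' is the opened LUKS mapping on top of it (/dev/mapper/luks-1edafecf-..., kernel /dev/dm-1), and '_mount_id' points at the LUKS device because that is what carries the filesystem. The per-volume verification that follows is driven by the _storage_volume_tests list just set: each entry selects a test-verify-volume-<item>.yml file, which is included once per item (hence the "(item=mount)", "(item=fstab)", ... markers below). The apparent pattern, sketched rather than quoted from the suite:

  - name: Run test verify for storage_test_volume_subset
    ansible.builtin.include_tasks: "test-verify-volume-{{ item }}.yml"
    loop: "{{ _storage_volume_tests }}"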
TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Thursday 16 April 2026 19:38:56 -0400 (0:00:00.341) 0:16:32.384 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache)

TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Thursday 16 April 2026 19:38:58 -0400 (0:00:01.975) 0:16:34.360 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Thursday 16 April 2026 19:38:58 -0400 (0:00:00.353) 0:16:34.713 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Thursday 16 April 2026 19:38:59 -0400 (0:00:00.327) 0:16:35.040 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Thursday 16 April 2026 19:38:59 -0400 (0:00:00.466) 0:16:35.507 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Thursday 16 April 2026 19:38:59 -0400 (0:00:00.303) 0:16:35.810 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Thursday 16 April 2026 19:39:00 -0400 (0:00:00.351) 0:16:36.162 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path:
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Thursday 16 April 2026 19:39:00 -0400 (0:00:00.346) 0:16:36.508 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Thursday 16 April 2026 19:39:00 -0400 (0:00:00.331) 0:16:36.839 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Thursday 16 April 2026 19:39:01 -0400 (0:00:00.210) 0:16:37.050 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Thursday 16 April 2026 19:39:01 -0400 (0:00:00.181) 0:16:37.232 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Thursday 16 April 2026 19:39:01 -0400 (0:00:00.188) 0:16:37.420 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Thursday 16 April 2026 19:39:01 -0400 (0:00:00.311) 0:16:37.731 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Thursday 16 April 2026 19:39:02 -0400 (0:00:00.812) 0:16:38.544 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
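Note: the fstab facts above are match lists: the suite collects the /etc/fstab lines containing the volume's mount id (the /dev/mapper/luks-... path), its mount point, and its options, then asserts that each list's length equals the expected count ("1" for a mounted volume). A reduced sketch of the id check, with illustrative task and variable names rather than the suite's exact code:

  - name: Read /etc/fstab
    ansible.builtin.slurp:
      src: /etc/fstab
    register: fstab

  - name: Verify that the device identifier appears in /etc/fstab
    ansible.builtin.assert:
      that: >-
        (fstab.content | b64decode).splitlines()
        | select('search', '^' + mount_id | regex_escape) | list | length == 1
    vars:
      mount_id: /dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479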
TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Thursday 16 April 2026 19:39:04 -0400 (0:00:01.689) 0:16:40.234 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Thursday 16 April 2026 19:39:04 -0400 (0:00:00.401) 0:16:40.635 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Thursday 16 April 2026 19:39:04 -0400 (0:00:00.176) 0:16:40.812 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52
Thursday 16 April 2026 19:39:05 -0400 (0:00:00.320) 0:16:41.147 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Thursday 16 April 2026 19:39:05 -0400 (0:00:00.234) 0:16:41.382 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Thursday 16 April 2026 19:39:05 -0400 (0:00:00.413) 0:16:41.796 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Thursday 16 April 2026 19:39:06 -0400 (0:00:00.479) 0:16:42.284 ********
ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382640.7818553, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382568.752838, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1913, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382568.752838, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Thursday 16 April 2026 19:39:07 -0400 (0:00:01.087)
0:16:43.372 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:39:07 -0400 (0:00:00.336) 0:16:43.709 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:39:07 -0400 (0:00:00.234) 0:16:43.943 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:39:08 -0400 (0:00:00.385) 0:16:44.328 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:39:08 -0400 (0:00:00.315) 0:16:44.644 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:39:08 -0400 (0:00:00.297) 0:16:44.942 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:39:09 -0400 (0:00:00.318) 0:16:45.260 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382688.5228672, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382568.990838, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 1961, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382568.990838, "nlink": 1, "path": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:39:10 -0400 (0:00:01.480) 0:16:46.740 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: 
Nothing to do

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Thursday 16 April 2026 19:39:12 -0400 (0:00:01.654) 0:16:48.394 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.006958", "end": "2026-04-16 19:39:13.463327", "rc": 0, "start": "2026-04-16 19:39:13.456369" }
STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 16384 MK bits: 512 MK digest: 68 99 ea 8d 5b 5e e9 da 9c 7d db 42 56 24 5d d5 78 4d 2c d1 MK salt: e3 9e 29 cf 53 6f 01 4d 79 14 e1 a8 41 f9 5e e2 81 97 d2 45 c5 13 25 e2 aa fb 0e ac b4 b0 4f 4a MK iterations: 132129 UUID: 1edafecf-4d8e-4f78-aff4-ede6c0f23479 Key Slot 0: ENABLED Iterations: 2111934 Salt: 71 1d f4 4e 82 b8 ef 6c 68 8c a7 6f c6 41 76 b8 4b 8c 5c a0 0e fb 8d e1 6a ff 1f 96 66 1a 31 d1 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Thursday 16 April 2026 19:39:13 -0400 (0:00:01.283) 0:16:49.678 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Thursday 16 April 2026 19:39:13 -0400 (0:00:00.280) 0:16:49.958 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Thursday 16 April 2026 19:39:14 -0400 (0:00:00.321) 0:16:50.280 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Thursday 16 April 2026 19:39:14 -0400 (0:00:00.342) 0:16:50.622 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Thursday 16 April 2026 19:39:14 -0400 (0:00:00.336) 0:16:50.959 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64
Thursday 16 April 2026 19:39:15 -0400 (0:00:00.532) 0:16:51.491 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption_key_size > 0", "skip_reason": "Conditional result was False" }
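Note: the luksDump output above shows what encryption_luks_version: luks1 with no explicit cipher or key size should produce: a "Version: 1" header, the aes / xts-plain64 default (the 512-bit master key is split by XTS into two 256-bit subkeys), and a single enabled key slot. It also confirms the layering: the LUKS header lives on the raw LV /dev/mapper/foo-test1, while the opened /dev/mapper/luks-<UUID> mapping is what carries the xfs filesystem. The version check reduces to a pattern match on the dump; an illustrative sketch, not the suite's exact task:

  - name: Collect LUKS info for this volume
    ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
    register: luks_dump
    changed_when: false  # luksDump only reads the header

  - name: Check LUKS version
    ansible.builtin.assert:
      that: luks_dump.stdout is search('Version:\s+1')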
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77
Thursday 16 April 2026 19:39:15 -0400 (0:00:00.206) 0:16:51.698 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90
Thursday 16 April 2026 19:39:16 -0400 (0:00:00.357) 0:16:52.056 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96
Thursday 16 April 2026 19:39:16 -0400 (0:00:00.594) 0:16:52.650 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103
Thursday 16 April 2026 19:39:16 -0400 (0:00:00.349) 0:16:53.000 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111
Thursday 16 April 2026 19:39:17 -0400 (0:00:00.305) 0:16:53.309 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119
Thursday 16 April 2026 19:39:17 -0400 (0:00:00.217) 0:16:53.526 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127
Thursday 16 April 2026 19:39:17 -0400 (0:00:00.381) 0:16:53.908 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Thursday 16 April 2026 19:39:18 -0400 (0:00:00.352) 0:16:54.261 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Thursday 16 April 2026 19:39:18 -0400 (0:00:00.267) 0:16:54.528 ********
skipping:
[managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:39:18 -0400 (0:00:00.260) 0:16:54.788 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:39:18 -0400 (0:00:00.236) 0:16:55.025 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:39:19 -0400 (0:00:00.204) 0:16:55.229 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:39:19 -0400 (0:00:00.290) 0:16:55.520 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:39:19 -0400 (0:00:00.177) 0:16:55.697 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:39:19 -0400 (0:00:00.307) 0:16:56.005 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:39:20 -0400 (0:00:00.235) 0:16:56.241 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:39:20 -0400 (0:00:00.243) 0:16:56.484 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": 
"Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:39:20 -0400 (0:00:00.328) 0:16:56.813 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:39:21 -0400 (0:00:01.208) 0:16:58.022 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:39:23 -0400 (0:00:01.296) 0:16:59.319 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:39:23 -0400 (0:00:00.392) 0:16:59.711 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:39:23 -0400 (0:00:00.312) 0:17:00.023 ******** ok: [managed-node13] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:39:25 -0400 (0:00:01.490) 0:17:01.514 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:39:25 -0400 (0:00:00.222) 0:17:01.736 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:39:25 -0400 (0:00:00.246) 0:17:01.982 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:39:26 -0400 (0:00:00.303) 0:17:02.286 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:39:26 -0400 (0:00:00.460) 0:17:02.747 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:39:27 -0400 (0:00:00.436) 0:17:03.183 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:39:27 -0400 (0:00:00.314) 0:17:03.498 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:39:27 -0400 (0:00:00.363) 0:17:03.885 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:39:28 -0400 (0:00:00.328) 0:17:04.213 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:39:28 -0400 (0:00:00.349) 0:17:04.563 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:39:28 -0400 (0:00:00.382) 0:17:04.945 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:39:29 -0400 (0:00:00.334) 0:17:05.279 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 
19:39:29 -0400 (0:00:00.327) 0:17:05.607 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:39:29 -0400 (0:00:00.344) 0:17:05.951 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:39:30 -0400 (0:00:00.344) 0:17:06.295 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:39:30 -0400 (0:00:00.344) 0:17:06.640 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:39:30 -0400 (0:00:00.251) 0:17:06.891 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:39:31 -0400 (0:00:00.256) 0:17:07.148 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:39:31 -0400 (0:00:00.296) 0:17:07.444 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:39:31 -0400 (0:00:00.363) 0:17:07.808 ******** ok: [managed-node13] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:39:32 -0400 (0:00:00.284) 0:17:08.093 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:39:32 -0400 (0:00:00.299) 0:17:08.393 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:39:32 -0400 (0:00:00.451) 0:17:08.844 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.029733", "end": "2026-04-16 19:39:33.942882", "rc": 0, "start": "2026-04-16 19:39:33.913149" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:39:34 -0400 (0:00:01.279) 0:17:10.124 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:39:34 -0400 (0:00:00.331) 0:17:10.456 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:39:34 -0400 (0:00:00.391) 0:17:10.847 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:39:35 -0400 (0:00:00.232) 0:17:11.080 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:39:35 -0400 (0:00:00.247) 0:17:11.327 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:39:35 -0400 (0:00:00.441) 0:17:11.768 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:39:36 -0400 (0:00:00.308) 0:17:12.076 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:39:36 -0400 (0:00:00.289) 0:17:12.366 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:39:36 -0400 (0:00:00.245) 0:17:12.612 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Thursday 16 April 2026 19:39:36 -0400 (0:00:00.286) 0:17:12.898 ******** changed: [managed-node13] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode - 5] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:417 Thursday 16 April 2026 19:39:38 -0400 (0:00:01.250) 0:17:14.149 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node13 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Thursday 16 April 2026 19:39:38 -0400 (0:00:00.625) 0:17:14.774 ******** ok: [managed-node13] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Thursday 16 April 2026 19:39:39 -0400 (0:00:00.318) 0:17:15.093 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:39:39 -0400 (0:00:00.423) 0:17:15.516 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:39:39 -0400 (0:00:00.029) 0:17:15.546 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" 
} TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:39:39 -0400 (0:00:00.305) 0:17:15.851 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:39:40 -0400 (0:00:00.433) 0:17:16.285 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:39:41 -0400 (0:00:01.503) 0:17:17.788 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:39:42 -0400 (0:00:00.413) 0:17:18.201 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:39:43 -0400 (0:00:01.793) 0:17:19.995 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:39:44 -0400 
(0:00:00.575) 0:17:20.571 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:39:44 -0400 (0:00:00.242) 0:17:20.814 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:39:45 -0400 (0:00:00.298) 0:17:21.113 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:39:45 -0400 (0:00:00.197) 0:17:21.310 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:39:45 -0400 (0:00:00.234) 0:17:21.545 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:39:46 -0400 (0:00:00.802) 0:17:22.347 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:39:46 -0400 (0:00:00.230) 0:17:22.577 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:39:46 -0400 (0:00:00.284) 0:17:22.862 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:39:48 -0400 (0:00:02.026) 0:17:24.889 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, 
"encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:39:49 -0400 (0:00:00.435) 0:17:25.324 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:39:49 -0400 (0:00:00.303) 0:17:25.627 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:39:51 -0400 (0:00:02.276) 0:17:27.904 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:39:52 -0400 (0:00:00.450) 0:17:28.355 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:39:52 -0400 (0:00:00.166) 0:17:28.521 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:39:52 -0400 (0:00:00.206) 0:17:28.728 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:39:52 -0400 (0:00:00.178) 0:17:28.906 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:39:54 -0400 (0:00:01.916) 0:17:30.823 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", 
"status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", 
"status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": 
"ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service": { "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { 
"name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": 
"not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:39:58 -0400 (0:00:04.025) 0:17:34.849 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"dev-mapper-foo\\\\x2dtest1.device\" systemd-udevd-kernel.socket cryptsetup-pre.target \"system-systemd\\\\x2dcryptsetup.slice\"", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\" cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": 
"\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target \"dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.device\"", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:38:12 EDT", "StateChangeTimestampMonotonic": "2502595840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:40:00 -0400 (0:00:01.852) 0:17:36.702 ******** fatal: [managed-node13]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:40:03 -0400 (0:00:02.541) 0:17:39.243 ******** fatal: [managed-node13]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 
TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:40:03 -0400 (0:00:02.541) 0:17:39.243 ******** fatal: [managed-node13]: FAILED! => { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479' in safe mode due to encryption removal", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False}
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:40:03 -0400 (0:00:00.192) 0:17:39.435 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id":
"systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", 
"StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:38:12 EDT", "StateChangeTimestampMonotonic": "2502595840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:40:05 -0400 (0:00:01.744) 0:17:41.180 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:40:05 -0400 (0:00:00.329) 0:17:41.510 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:40:05 -0400 (0:00:00.430) 0:17:41.941 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:40:06 -0400 (0:00:00.320) 0:17:42.262 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382777.8848894, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382777.8848894, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776382777.8848894, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "2952465409", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] 
TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:40:07 -0400 (0:00:01.184) 0:17:43.446 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
TASK [Remove the encryption layer - 3] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:440 Thursday 16 April 2026 19:40:07 -0400 (0:00:00.203) 0:17:43.649 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13
TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:40:08 -0400 (0:00:00.402) 0:17:44.052 ******** META: facts cleared
TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:40:08 -0400 (0:00:00.008) 0:17:44.061 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" }
TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:40:08 -0400 (0:00:00.233) 0:17:44.294 ******** included: fedora.linux_system_roles.storage for managed-node13
TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:40:08 -0400 (0:00:00.462) 0:17:44.757 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" }
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:40:10 -0400 (0:00:01.588) 0:17:46.346 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13
TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:40:10 -0400 (0:00:00.353) 0:17:46.699 ******** ok: [managed-node13]
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:40:12 -0400 (0:00:01.939) 0:17:48.638 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is
file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:40:13 -0400 (0:00:00.865) 0:17:49.504 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:40:13 -0400 (0:00:00.319) 0:17:49.824 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:40:14 -0400 (0:00:00.249) 0:17:50.073 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:40:14 -0400 (0:00:00.246) 0:17:50.319 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:40:14 -0400 (0:00:00.235) 0:17:50.555 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:40:15 -0400 (0:00:00.861) 0:17:51.417 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:40:15 -0400 (0:00:00.282) 0:17:51.699 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:40:15 -0400 (0:00:00.208) 0:17:51.908 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:40:17 -0400 (0:00:02.030) 0:17:53.939 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:40:18 -0400 (0:00:00.285) 0:17:54.224 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:40:18 -0400 (0:00:00.290) 0:17:54.514 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:40:20 -0400 (0:00:02.392) 0:17:56.907 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:40:21 -0400 (0:00:00.493) 0:17:57.400 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:40:21 -0400 (0:00:00.174) 0:17:57.575 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | 
d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:40:21 -0400 (0:00:00.112) 0:17:57.688 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:40:21 -0400 (0:00:00.124) 0:17:57.812 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:40:23 -0400 (0:00:01.699) 0:17:59.512 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": 
"container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", 
"state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, 
"qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": 
"sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service": { "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": 
"systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:40:26 -0400 (0:00:02.941) 0:18:02.453 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" \"dev-mapper-foo\\\\x2dtest1.device\" cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target \"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\" cryptsetup.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": 
"yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": 
"infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.device\" cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:38:12 EDT", "StateChangeTimestampMonotonic": "2502595840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:40:28 -0400 (0:00:02.049) 0:18:04.502 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:40:31 -0400 (0:00:03.130) 0:18:07.632 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:40:31 -0400 (0:00:00.273) 0:18:07.906 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382577.73084, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "626483cc4af1733e5b9d55fb82a59a61ed6ea3de", "ctime": 1776382577.72684, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382577.72684, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:40:33 -0400 (0:00:01.245) 0:18:09.151 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:40:34 -0400 (0:00:01.462) 0:18:10.614 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", 
"CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": 
"-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "RequiredBy": "\"dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.device\" cryptsetup.target", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:38:12 EDT", "StateChangeTimestampMonotonic": "2502595840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:40:36 -0400 (0:00:01.984) 
0:18:12.598 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/mapper/foo-test1", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:40:36 -0400 (0:00:00.383) 0:18:12.981 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", 
"raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:40:37 -0400 (0:00:00.421) 0:18:13.403 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:40:37 -0400 (0:00:00.252) 0:18:13.655 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:40:39 -0400 (0:00:01.376) 0:18:15.032 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:40:40 -0400 (0:00:01.541) 0:18:16.574 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': 
'/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:40:42 -0400 (0:00:01.765) 0:18:18.340 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:40:42 -0400 (0:00:00.494) 0:18:18.834 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:40:44 -0400 (0:00:01.675) 0:18:20.510 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382590.507843, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4b23f0039fa353d905d020ad1b0d9e45dc1129c7", "ctime": 1776382583.3958414, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 666894680, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776382583.39697, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "3986624464", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:40:45 -0400 (0:00:01.339) 0:18:21.849 ******** changed: [managed-node13] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479', 'password': '-', 'state': 'absent'}) => { 
"ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:40:47 -0400 (0:00:01.427) 0:18:23.276 ******** ok: [managed-node13] TASK [Verify role results - 9] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455 Thursday 16 April 2026 19:40:49 -0400 (0:00:01.803) 0:18:25.080 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:40:49 -0400 (0:00:00.621) 0:18:25.702 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:40:50 -0400 (0:00:00.437) 0:18:26.139 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242
Thursday 16 April 2026 19:40:47 -0400 (0:00:01.427) 0:18:23.276 ********
ok: [managed-node13]
TASK [Verify role results - 9] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:455
Thursday 16 April 2026 19:40:49 -0400 (0:00:01.803) 0:18:25.080 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13
TASK [Print out pool information] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Thursday 16 April 2026 19:40:49 -0400 (0:00:00.621) 0:18:25.702 ********
ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }
TASK [Print out volume information] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Thursday 16 April 2026 19:40:50 -0400 (0:00:00.437) 0:18:26.139 ********
skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" }
TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Thursday 16 April 2026 19:40:50 -0400 (0:00:00.217) 0:18:26.357 ********
ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "f559d3bf-7d0f-4775-8a0b-9455d9f66ec1" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } }
TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Thursday 16 April 2026 19:40:51 -0400 (0:00:01.185) 0:18:27.542 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003024", "end": "2026-04-16 19:40:52.523524", "rc": 0, "start": "2026-04-16 19:40:52.520500" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Tue Apr 14 06:59:53 2026
#
# Accessible filesystems, by reference, are maintained under '/dev/disk/'.
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info.
#
# After editing this file, run 'systemctl daemon-reload' to update systemd
# units generated from this file.
#
UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0
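The fingerprint comment at the top and the final entry confirm the outcome: /opt/test1 now mounts the plain LV, and the old /dev/mapper/luks-* line is gone. The entry could be reproduced idempotently with the module the role's mount steps were redirected to earlier in the log (ansible.builtin.mount to ansible.posix.mount); the parameters below are taken from the "Set up new/current mounts" result, so only the task name is invented:

  - name: Ensure /opt/test1 is mounted from the plain LV and persisted in fstab
    ansible.posix.mount:
      src: /dev/mapper/foo-test1
      path: /opt/test1
      fstype: xfs
      opts: defaults
      state: mounted  # mounts now and writes/keeps the /etc/fstab entry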
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Thursday 16 April 2026 19:40:52 -0400 (0:00:01.159) 0:18:28.702 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003595", "end": "2026-04-16 19:40:53.702180", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:40:53.698585" }
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Thursday 16 April 2026 19:40:53 -0400 (0:00:01.191) 0:18:29.893 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}]})
TASK [Set _storage_pool_tests] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5
Thursday 16 April 2026 19:40:54 -0400 (0:00:00.412) 0:18:30.306 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18
Thursday 16 April 2026 19:40:54 -0400 (0:00:00.303) 0:18:30.609 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.032376", "end": "2026-04-16 19:40:55.830013", "rc": 0, "start": "2026-04-16 19:40:55.797637" }
STDOUT:
0
TASK [Verify that VG shared value checks out] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24
Thursday 16 April 2026 19:40:56 -0400 (0:00:01.449) 0:18:32.058 ********
ok: [managed-node13] => { "changed": false }
MSG:
All assertions passed
TASK [Verify pool subset] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34
Thursday 16 April 2026 19:40:56 -0400 (0:00:00.317) 0:18:32.376 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes)
TASK [Set test variables] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2
Thursday 16 April 2026 19:40:58 -0400 (0:00:02.191) 0:18:34.567 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }
TASK [Get the canonical device path for each member device] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8
Thursday 16 April 2026 19:40:59 -0400 (0:00:00.482) 0:18:35.050 ********
ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }
TASK [Set pvs lvm length] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17
Thursday 16 April 2026 19:41:00 -0400 (0:00:01.450) 0:18:36.500 ********
ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }
TASK [Set pool pvs] ************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22
Thursday 16 April 2026 19:41:00 -0400 (0:00:00.356) 0:18:36.856 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }
TASK [Verify PV count]
********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:41:01 -0400 (0:00:00.480) 0:18:37.336 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:41:01 -0400 (0:00:00.317) 0:18:37.654 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:41:01 -0400 (0:00:00.306) 0:18:37.960 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:41:02 -0400 (0:00:00.393) 0:18:38.353 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:41:02 -0400 (0:00:00.339) 0:18:38.693 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:41:03 -0400 (0:00:00.423) 0:18:39.117 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:117848): WARNING **: 19:41:04.297: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:41:04 -0400 (0:00:01.510) 0:18:40.628 ******** skipping: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:41:04 -0400 (0:00:00.316) 0:18:40.945 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:41:05 -0400 (0:00:00.606) 0:18:41.551 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:41:05 -0400 (0:00:00.221) 0:18:41.773 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:41:05 -0400 (0:00:00.229) 0:18:42.003 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:41:06 -0400 (0:00:00.222) 0:18:42.226 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:41:06 -0400 (0:00:00.200) 0:18:42.426 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:41:06 -0400 (0:00:00.182) 0:18:42.608 ******** skipping: [managed-node13] => { 
"changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:41:06 -0400 (0:00:00.181) 0:18:42.790 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:41:07 -0400 (0:00:00.245) 0:18:43.035 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:41:07 -0400 (0:00:00.257) 0:18:43.292 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:41:07 -0400 (0:00:00.247) 0:18:43.540 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:41:07 -0400 (0:00:00.214) 0:18:43.755 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:41:07 -0400 (0:00:00.262) 0:18:44.018 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:41:08 -0400 (0:00:00.625) 0:18:44.673 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 
'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 16 April 2026 19:41:09 -0400 (0:00:00.485) 0:18:45.158 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Thursday 16 April 2026 19:41:09 -0400 (0:00:00.305) 0:18:45.464 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Thursday 16 April 2026 19:41:09 -0400 (0:00:00.284) 0:18:45.748 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Thursday 16 April 2026 19:41:10 -0400 (0:00:00.472) 0:18:46.221 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Thursday 16 April 2026 19:41:10 -0400 (0:00:00.313) 0:18:46.535 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Thursday 16 April 2026 19:41:10 -0400 (0:00:00.279) 0:18:46.814 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* 
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Thursday 16 April 2026 19:41:11 -0400 (0:00:00.335) 0:18:47.150 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:41:11 -0400 (0:00:00.286) 0:18:47.437 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:41:12 -0400 (0:00:00.704) 0:18:48.141 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Thursday 16 April 2026 19:41:12 -0400 (0:00:00.482) 0:18:48.624 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Thursday 16 April 2026 19:41:12 -0400 (0:00:00.203) 0:18:48.827 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Thursday 16 April 2026 19:41:13 -0400 (0:00:00.235) 0:18:49.062 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Thursday 16 April 2026 19:41:13 -0400 (0:00:00.288) 0:18:49.351 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Thursday 16 April 2026 19:41:13 -0400 (0:00:00.189) 0:18:49.541 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Thursday 16 April 2026 19:41:14 -0400 (0:00:00.730) 0:18:50.271 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Thursday 16 April 2026 19:41:14 -0400 (0:00:00.304) 0:18:50.575 ******** skipping: [managed-node13] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Thursday 16 April 2026 19:41:14 -0400 (0:00:00.309) 0:18:50.885 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node13 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Thursday 16 April 2026 19:41:15 -0400 (0:00:00.446) 0:18:51.331 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Thursday 16 April 2026 19:41:15 -0400 (0:00:00.339) 0:18:51.705 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Thursday 16 April 2026 19:41:16 -0400 (0:00:00.347) 0:18:52.053 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Thursday 16 April 2026 19:41:16 -0400 (0:00:00.281) 0:18:52.334 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Thursday 16 April 2026 19:41:16 -0400 (0:00:00.228) 0:18:52.563 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Thursday 16 April 2026 19:41:16 -0400 (0:00:00.263) 0:18:52.826 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Thursday 16 April 2026 19:41:17 -0400 (0:00:00.294) 0:18:53.120 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Thursday 16 April 2026 19:41:17 -0400 (0:00:00.237) 0:18:53.358 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Thursday 16 April 2026 19:41:17 -0400 (0:00:00.627) 0:18:53.986 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 
TASK [Check VDO] ***************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100
Thursday 16 April 2026 19:41:17 -0400 (0:00:00.237) 0:18:53.358 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13

TASK [Validate pool member VDO settings] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2
Thursday 16 April 2026 19:41:17 -0400 (0:00:00.627) 0:18:53.986 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Get information about VDO deduplication] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8
Thursday 16 April 2026 19:41:18 -0400 (0:00:00.535) 0:18:54.521 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15
Thursday 16 April 2026 19:41:18 -0400 (0:00:00.261) 0:18:54.783 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21
Thursday 16 April 2026 19:41:18 -0400 (0:00:00.216) 0:18:55.000 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Get information about VDO compression] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27
Thursday 16 April 2026 19:41:19 -0400 (0:00:00.251) 0:18:55.251 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is off - 2] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34
Thursday 16 April 2026 19:41:19 -0400 (0:00:00.243) 0:18:55.495 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Check if VDO deduplication is on - 2] ************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40
Thursday 16 April 2026 19:41:19 -0400 (0:00:00.224) 0:18:55.720 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46
Thursday 16 April 2026 19:41:19 -0400 (0:00:00.212) 0:18:55.932 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }

TASK [Check Stratis] ***********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103
Thursday 16 April 2026 19:41:20 -0400 (0:00:00.233) 0:18:56.166 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13

TASK [Get stratis pool information] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6
Thursday 16 April 2026 19:41:21 -0400 (0:00:00.884) 0:18:57.051 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Print script output] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15
Thursday 16 April 2026 19:41:21 -0400 (0:00:00.216) 0:18:57.267 ********
skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" }

TASK [Get information about Stratis] *******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19
Thursday 16 April 2026 19:41:21 -0400 (0:00:00.227) 0:18:57.495 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that the pool was created] ****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23
Thursday 16 April 2026 19:41:21 -0400 (0:00:00.252) 0:18:57.747 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that encryption is correctly set] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30
Thursday 16 April 2026 19:41:21 -0400 (0:00:00.274) 0:18:58.005 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Verify that Clevis/Tang encryption is correctly set] *********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39
Thursday 16 April 2026 19:41:22 -0400 (0:00:00.223) 0:18:58.279 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" }

TASK [Reset variable used by test] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49
Thursday 16 April 2026 19:41:22 -0400 (0:00:00.227) 0:18:58.503 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false }
TASK [Clean up test variables] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106
Thursday 16 April 2026 19:41:22 -0400 (0:00:00.227) 0:18:58.730 ********
ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false }

TASK [Verify the volumes] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3
Thursday 16 April 2026 19:41:22 -0400 (0:00:00.241) 0:18:58.972 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': 0, 'encryption_luks_version': 'luks1', 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/foo-test1', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/foo-test1', '_kernel_device': '/dev/dm-0', '_raw_kernel_device': '/dev/dm-0'})

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Thursday 16 April 2026 19:41:23 -0400 (0:00:00.431) 0:18:59.404 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for storage_test_volume_subset] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Thursday 16 April 2026 19:41:23 -0400 (0:00:00.381) 0:18:59.785 ********
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size)
included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache)
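The eight includes above are driven by the _storage_volume_tests list set two tasks earlier: one include per entry, with the loop variable selecting the file name. A sketch of the likely mechanism (the loop variable name is taken from the task title; the exact source may differ):

- name: Run test verify for storage_test_volume_subset
  ansible.builtin.include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset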
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Thursday 16 April 2026 19:41:25 -0400 (0:00:01.960) 0:19:01.746 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Thursday 16 April 2026 19:41:26 -0400 (0:00:00.393) 0:19:02.139 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Thursday 16 April 2026 19:41:26 -0400 (0:00:00.322) 0:19:02.487 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32
Thursday 16 April 2026 19:41:26 -0400 (0:00:00.389) 0:19:02.877 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40
Thursday 16 April 2026 19:41:27 -0400 (0:00:00.299) 0:19:03.176 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51
Thursday 16 April 2026 19:41:27 -0400 (0:00:00.414) 0:19:03.590 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62
Thursday 16 April 2026 19:41:27 -0400 (0:00:00.438) 0:19:04.029 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76
Thursday 16 April 2026 19:41:28 -0400 (0:00:00.368) 0:19:04.397 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82
Thursday 16 April 2026 19:41:28 -0400 (0:00:00.222) 0:19:04.620 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88
Thursday 16 April 2026 19:41:28 -0400 (0:00:00.146) 0:19:04.767 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98
Thursday 16 April 2026 19:41:28 -0400 (0:00:00.194) 0:19:05.003 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Thursday 16 April 2026 19:41:29 -0400 (0:00:00.268) 0:19:05.272 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Thursday 16 April 2026 19:41:29 -0400 (0:00:00.356) 0:19:05.940 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Thursday 16 April 2026 19:41:30 -0400 (0:00:00.324) 0:19:06.297 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed
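The fstab checks work by regex: the test collects every /etc/fstab line matching the device id, mount point, and options, then asserts that each list has the expected length ("1" apiece here). An illustrative reconstruction, assuming storage_test_fstab holds the slurped file content (the role's real tasks build these lists differently):

- name: Set some variables for fstab checking
  ansible.builtin.set_fact:
    storage_test_fstab_id_matches: >-
      {{ storage_test_fstab.content | b64decode
         | regex_findall('^' ~ storage_test_volume._mount_id ~ ' ', multiline=True) }}

- name: Verify that the device identifier appears in /etc/fstab
  ansible.builtin.assert:
    that: >-
      storage_test_fstab_id_matches | length ==
      storage_test_fstab_expected_id_matches | int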
"false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:41:30 -0400 (0:00:00.229) 0:19:06.851 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:41:31 -0400 (0:00:00.377) 0:19:07.229 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:41:31 -0400 (0:00:00.269) 0:19:07.498 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:41:31 -0400 (0:00:00.368) 0:19:07.867 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:41:32 -0400 (0:00:00.437) 0:19:08.305 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382831.3799028, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382831.3799028, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2049, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382831.3799028, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:41:33 -0400 (0:00:01.283) 0:19:09.589 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:41:33 -0400 (0:00:00.341) 0:19:09.930 ******** skipping: [managed-node13] => { 
"changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:41:34 -0400 (0:00:00.222) 0:19:10.152 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:41:34 -0400 (0:00:00.315) 0:19:10.468 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:41:34 -0400 (0:00:00.302) 0:19:10.770 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:41:34 -0400 (0:00:00.217) 0:19:10.987 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:41:35 -0400 (0:00:00.371) 0:19:11.359 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:41:35 -0400 (0:00:00.218) 0:19:11.577 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:41:37 -0400 (0:00:01.955) 0:19:13.533 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:41:37 -0400 (0:00:00.176) 0:19:13.709 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:41:37 -0400 (0:00:00.239) 0:19:13.963 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:41:38 -0400 (0:00:00.414) 0:19:14.377 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:41:38 -0400 (0:00:00.240) 0:19:14.617 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:41:38 -0400 (0:00:00.239) 0:19:14.857 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:41:39 -0400 (0:00:00.221) 0:19:15.078 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:41:39 -0400 (0:00:00.249) 0:19:15.328 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:41:39 -0400 (0:00:00.174) 0:19:15.502 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:41:39 -0400 (0:00:00.395) 0:19:15.898 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 
Thursday 16 April 2026 19:41:40 -0400 (0:00:00.381) 0:19:16.279 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:41:40 -0400 (0:00:00.233) 0:19:16.513 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:41:40 -0400 (0:00:00.158) 0:19:16.671 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:41:40 -0400 (0:00:00.201) 0:19:16.873 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:41:41 -0400 (0:00:00.206) 0:19:17.079 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:41:41 -0400 (0:00:00.238) 0:19:17.317 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:41:41 -0400 (0:00:00.197) 0:19:17.515 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:41:41 -0400 (0:00:00.302) 0:19:17.817 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: 
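All of the LUKS tasks above are fenced off by storage_test_volume.encryption, which is false for this run. For an encrypted volume they would collect and assert on LUKS metadata; a sketch of what that looks like (cryptsetup luksDump is a real command, but the task shape here is illustrative, not the role's source):

- name: Collect LUKS info for this volume
  ansible.builtin.command:
    cmd: cryptsetup luksDump {{ storage_test_volume._raw_device }}
  register: luks_dump
  changed_when: false
  when: storage_test_volume.encryption and _storage_test_volume_present

- name: Check LUKS version
  ansible.builtin.assert:
    that: luks_dump.stdout is search('Version:\s+1')
  when: storage_test_volume.encryption and storage_test_volume.encryption_luks_version == 'luks1'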
TASK [Get information about RAID] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Thursday 16 April 2026 19:41:41 -0400 (0:00:00.206) 0:19:17.079 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Thursday 16 April 2026 19:41:41 -0400 (0:00:00.238) 0:19:17.317 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Thursday 16 April 2026 19:41:41 -0400 (0:00:00.197) 0:19:17.515 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Thursday 16 April 2026 19:41:41 -0400 (0:00:00.302) 0:19:17.817 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Thursday 16 April 2026 19:41:41 -0400 (0:00:00.199) 0:19:18.017 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Thursday 16 April 2026 19:41:42 -0400 (0:00:00.273) 0:19:18.290 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Thursday 16 April 2026 19:41:42 -0400 (0:00:00.243) 0:19:18.534 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Thursday 16 April 2026 19:41:42 -0400 (0:00:00.277) 0:19:18.812 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Thursday 16 April 2026 19:41:43 -0400 (0:00:00.240) 0:19:19.052 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Thursday 16 April 2026 19:41:43 -0400 (0:00:00.257) 0:19:19.321 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Thursday 16 April 2026 19:41:43 -0400 (0:00:00.199) 0:19:19.521 ********
ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Thursday 16 April 2026 19:41:44 -0400 (0:00:01.329) 0:19:20.850 ********
ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" }
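Both parses above resolve "4g" to the same byte count because lvm-style sizes are treated as binary units: 4 GiB = 4 x 2^30 = 4294967296 bytes. The same conversion is available as a stock filter:

- name: Parse the requested size of the volume
  ansible.builtin.debug:
    msg: "{{ '4g' | human_to_bytes }}"  # prints 4294967296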
TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Thursday 16 April 2026 19:41:45 -0400 (0:00:01.079) 0:19:21.930 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Thursday 16 April 2026 19:41:46 -0400 (0:00:00.420) 0:19:22.351 ********
ok: [managed-node13] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Thursday 16 April 2026 19:41:46 -0400 (0:00:00.243) 0:19:22.595 ********
ok: [managed-node13] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Thursday 16 April 2026 19:41:47 -0400 (0:00:01.140) 0:19:23.736 ********
skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Thursday 16 April 2026 19:41:47 -0400 (0:00:00.275) 0:19:24.011 ********
skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Thursday 16 April 2026 19:41:48 -0400 (0:00:00.251) 0:19:24.262 ********
skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Thursday 16 April 2026 19:41:48 -0400 (0:00:00.187) 0:19:24.450 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68
Thursday 16 April 2026 19:41:48 -0400 (0:00:00.259) 0:19:24.710 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72
Thursday 16 April 2026 19:41:49 -0400 (0:00:00.352) 0:19:25.062 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77
Thursday 16 April 2026 19:41:49 -0400 (0:00:00.337) 0:19:25.400 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83
Thursday 16 April 2026 19:41:49 -0400 (0:00:00.205) 0:19:25.605 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87
Thursday 16 April 2026 19:41:49 -0400 (0:00:00.259) 0:19:25.865 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92
Thursday 16 April 2026 19:41:50 -0400 (0:00:00.365) 0:19:26.230 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97
Thursday 16 April 2026 19:41:50 -0400 (0:00:00.353) 0:19:26.584 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102
Thursday 16 April 2026 19:41:50 -0400 (0:00:00.312) 0:19:26.897 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106
Thursday 16 April 2026 19:41:51 -0400 (0:00:00.326) 0:19:27.223 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110
Thursday 16 April 2026 19:41:51 -0400 (0:00:00.276) 0:19:27.500 ********
skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" }

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114
Thursday 16 April 2026 19:41:51 -0400 (0:00:00.296) 0:19:27.797 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value - 2] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121
Thursday 16 April 2026 19:41:52 -0400 (0:00:00.241) 0:19:28.038 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128
Thursday 16 April 2026 19:41:52 -0400 (0:00:00.343) 0:19:28.382 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132
Thursday 16 April 2026 19:41:52 -0400 (0:00:00.295) 0:19:28.678 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138
Thursday 16 April 2026 19:41:52 -0400 (0:00:00.317) 0:19:28.995 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144
Thursday 16 April 2026 19:41:53 -0400 (0:00:00.326) 0:19:29.321 ********
ok: [managed-node13] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } }

TASK [Show expected size - 2] **************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148
Thursday 16 April 2026 19:41:53 -0400 (0:00:00.297) 0:19:29.619 ********
ok: [managed-node13] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152
Thursday 16 April 2026 19:41:53 -0400 (0:00:00.264) 0:19:29.884 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Get information about the LV] ********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Thursday 16 April 2026 19:41:54 -0400 (0:00:00.392) 0:19:30.277 ********
ok: [managed-node13] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.030628", "end": "2026-04-16 19:41:55.296907", "rc": 0, "start": "2026-04-16 19:41:55.266279" }
STDOUT:
  LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear
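The cache check shells out to lvs with --nameprefixes, so every field comes back as an LVM2_*=value pair instead of a positional column. The invocation below is the one from the log; the parsing task is one plausible way to turn the output into the storage_test_lv_segtype fact that appears next:

- name: Get information about the LV
  ansible.builtin.command:
    cmd: >-
      lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
      -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out
  changed_when: false

- name: Set LV segment type
  ansible.builtin.set_fact:
    storage_test_lv_segtype: "{{ lvs_out.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"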
TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Thursday 16 April 2026 19:41:55 -0400 (0:00:01.182) 0:19:31.459 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Thursday 16 April 2026 19:41:55 -0400 (0:00:00.302) 0:19:31.761 ********
ok: [managed-node13] => { "changed": false }
MSG: All assertions passed

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Thursday 16 April 2026 19:41:56 -0400 (0:00:00.329) 0:19:32.091 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Thursday 16 April 2026 19:41:56 -0400 (0:00:00.232) 0:19:32.324 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Thursday 16 April 2026 19:41:56 -0400 (0:00:00.165) 0:19:32.490 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Thursday 16 April 2026 19:41:56 -0400 (0:00:00.245) 0:19:32.735 ********
skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Thursday 16 April 2026 19:41:56 -0400 (0:00:00.235) 0:19:32.971 ********
ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43
Thursday 16 April 2026 19:41:57 -0400 (0:00:00.254) 0:19:33.245 ********
skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52
Thursday 16 April 2026 19:41:57 -0400 (0:00:00.247) 0:19:33.492 ********
ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Create a file] ***********************************************************
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12
Thursday 16 April 2026 19:41:57 -0400 (0:00:00.230) 0:19:33.725 ********
changed: [managed-node13] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 }
(0:00:01.471) 0:19:38.276 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:42:02 -0400 (0:00:00.268) 0:19:38.545 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:42:04 -0400 (0:00:01.735) 0:19:40.281 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:42:05 -0400 (0:00:00.772) 0:19:41.053 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:42:05 -0400 (0:00:00.253) 0:19:41.306 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:42:05 -0400 (0:00:00.140) 0:19:41.446 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK 
[fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:42:05 -0400 (0:00:00.219) 0:19:41.665 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:42:05 -0400 (0:00:00.159) 0:19:41.825 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:42:06 -0400 (0:00:00.670) 0:19:42.496 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:42:06 -0400 (0:00:00.274) 0:19:42.771 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:42:07 -0400 (0:00:00.285) 0:19:43.057 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:42:08 -0400 (0:00:01.895) 0:19:44.952 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:42:09 -0400 (0:00:00.306) 0:19:45.259 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:42:09 -0400 (0:00:00.278) 0:19:45.537 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:42:11 -0400 (0:00:02.097) 0:19:47.635 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:42:12 -0400 (0:00:00.487) 0:19:48.123 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:42:12 -0400 (0:00:00.195) 0:19:48.319 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:42:12 -0400 (0:00:00.199) 0:19:48.518 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:42:12 -0400 (0:00:00.205) 0:19:48.724 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:42:14 -0400 (0:00:01.831) 0:19:50.556 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { 
"name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", 
"state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service": { "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "source": "systemd", "state": "stopped", "status": "generated" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", 
"source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:42:19 -0400 (0:00:04.514) 0:19:55.070 ******** changed: [managed-node13] => (item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "AccessSELinuxContext": "system_u:object_r:systemd_unit_file_t:s0", 
"ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "\"dev-mapper-foo\\\\x2dtest1.device\" systemd-journald.socket \"system-systemd\\\\x2dcryptsetup.slice\" cryptsetup-pre.target systemd-udevd-kernel.socket", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target \"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\" umount.target", "BindsTo": "\"dev-mapper-foo\\\\x2dtest1.device\"", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "Conflicts": "umount.target", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "no", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "Cryptography Setup for luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479", "DevicePolicy": "auto", "Documentation": "\"man:crypttab(5)\" \"man:systemd-cryptsetup-generator(8)\" \"man:systemd-cryptsetup@.service(8)\"", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 /dev/mapper/foo-test1 - ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStopEx": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "yes", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "shared", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "524288", "LimitNOFILESoft": "1024", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadState": "loaded", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMPolicy": "stop", "OOMScoreAdjust": "500", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "yes", "RemoveIPC": "no", "RequiredBy": "cryptsetup.target", "Requires": "\"system-systemd\\\\x2dcryptsetup.slice\"", "Restart": "no", 
"RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestamp": "Thu 2026-04-16 19:38:12 EDT", "StateChangeTimestampMonotonic": "2502595840", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": "infinity", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "infinity", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "infinity", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "generated", "UtmpMode": "init", "Wants": "\"blockdev@dev-mapper-luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.target\"", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:42:20 -0400 (0:00:01.383) 0:19:56.453 ******** fatal: [managed-node13]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129 Thursday 16 April 2026 19:42:22 -0400 (0:00:02.064) 0:19:58.518 ******** fatal: [managed-node13]: FAILED! 
=> { "changed": false } MSG: {'changed': False, 'actions': [], 'leaves': [], 'mounts': [], 'crypts': [], 'pools': [], 'volumes': [], 'packages': [], 'failed': True, 'msg': "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", 'invocation': {'module_args': {'pools': [{'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None}]}], 'volumes': [], 'use_partitions': None, 'disklabel_type': None, 'pool_defaults': {'state': 'present', 'type': 'lvm', 'disks': [], 'volumes': [], 'grow_to_fill': False, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, 'shared': False}, 'volume_defaults': {'state': 'present', 'type': 'lvm', 'size': 0, 'disks': [], 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_stripe_size': None, 'raid_metadata_version': None, 'encryption': False, 'encryption_password': None, 'encryption_key': None, 'encryption_cipher': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': []}, 'safe_mode': True, 'uses_kmod_kvdo': True, 'packages_only': False, 'diskvolume_mkfs_option_map': {}}}, '_ansible_no_log': False} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:42:22 -0400 (0:00:00.360) 0:19:58.879 ******** changed: [managed-node13] => 
(item=systemd-cryptsetup@luks\x2d1edafecf\x2d4d8e\x2d4f78\x2daff4\x2dede6c0f23479.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "name": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "CPUAccounting": "yes", "CPUAffinityFromNUMA": "no", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "[not set]", "CPUUsageNSec": "[not set]", "CPUWeight": "[not set]", "CacheDirectoryMode": "0755", "CanFreeze": "yes", "CanIsolate": "no", "CanReload": "no", "CanStart": "no", "CanStop": "yes", "CapabilityBoundingSet": "cap_chown cap_dac_override cap_dac_read_search cap_fowner cap_fsetid cap_kill cap_setgid cap_setuid cap_setpcap cap_linux_immutable cap_net_bind_service cap_net_broadcast cap_net_admin cap_net_raw cap_ipc_lock cap_ipc_owner cap_sys_module cap_sys_rawio cap_sys_chroot cap_sys_ptrace cap_sys_pacct cap_sys_admin cap_sys_boot cap_sys_nice cap_sys_resource cap_sys_time cap_sys_tty_config cap_mknod cap_lease cap_audit_write cap_audit_control cap_setfcap cap_mac_override cap_mac_admin cap_syslog cap_wake_alarm cap_block_suspend cap_audit_read cap_perfmon cap_bpf cap_checkpoint_restore", "CleanResult": "success", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ConfigurationDirectoryMode": "0755", "ControlGroupId": "0", "ControlPID": "0", "CoredumpFilter": "0x33", "DefaultDependencies": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "DevicePolicy": "auto", "DynamicUser": "no", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExitType": "main", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FinalKillSignal": "9", "FragmentPath": "/etc/systemd/system/systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "FreezerState": "running", "GID": "[not set]", "GuessMainPID": "yes", "IOAccounting": "no", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "IOWeight": "[not set]", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "IPAccounting": "no", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "Id": "systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service", "IgnoreOnIsolate": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "JobTimeoutUSec": "infinity", "KeyringMode": "private", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": 
"infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitMEMLOCK": "454168576", "LimitMEMLOCKSoft": "454168576", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitNOFILE": "1073741816", "LimitNOFILESoft": "1073741816", "LimitNPROC": "13683", "LimitNPROCSoft": "13683", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "LimitSIGPENDING": "13683", "LimitSIGPENDINGSoft": "13683", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LoadError": "org.freedesktop.systemd1.UnitMasked \"Unit systemd-cryptsetup@luks\\x2d1edafecf\\x2d4d8e\\x2d4f78\\x2daff4\\x2dede6c0f23479.service is masked.\"", "LoadState": "masked", "LockPersonality": "no", "LogLevelMax": "-1", "LogRateLimitBurst": "0", "LogRateLimitIntervalUSec": "0", "LogsDirectoryMode": "0755", "MainPID": "0", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "ManagedOOMSwap": "auto", "MemoryAccounting": "yes", "MemoryAvailable": "infinity", "MemoryCurrent": "[not set]", "MemoryDenyWriteExecute": "no", "MemoryHigh": "infinity", "MemoryLimit": "infinity", "MemoryLow": "0", "MemoryMax": "infinity", "MemoryMin": "0", "MemoryPeak": "18446744073709551615", "MemorySwapMax": "infinity", "MountAPIVFS": "no", "NFileDescriptorStore": "0", "NRestarts": "0", "NUMAPolicy": "n/a", "Names": "\"systemd-cryptsetup@luks\\\\x2d1edafecf\\\\x2d4d8e\\\\x2d4f78\\\\x2daff4\\\\x2dede6c0f23479.service\"", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "OnSuccessJobMode": "fail", "Perpetual": "no", "PrivateDevices": "no", "PrivateIPC": "no", "PrivateMounts": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "PrivateUsers": "no", "ProcSubset": "all", "ProtectClock": "no", "ProtectControlGroups": "no", "ProtectHome": "no", "ProtectHostname": "no", "ProtectKernelLogs": "no", "ProtectKernelModules": "no", "ProtectKernelTunables": "no", "ProtectProc": "default", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "ReloadResult": "success", "ReloadSignal": "1", "RemainAfterExit": "no", "RemoveIPC": "no", "Restart": "no", "RestartKillSignal": "15", "RestartUSec": "100ms", "RestrictNamespaces": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "RuntimeDirectoryPreserve": "no", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitIntervalUSec": "10s", "StartupBlockIOWeight": "[not set]", "StartupCPUShares": "[not set]", "StartupCPUWeight": "[not set]", "StartupIOWeight": "[not set]", "StateChangeTimestampMonotonic": "0", "StateDirectoryMode": "0755", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SuccessAction": "none", "SyslogFacility": "3", "SyslogLevel": "6", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "2147483646", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "yes", "TasksCurrent": "[not set]", "TasksMax": "21893", "TimeoutAbortUSec": 
"1min 30s", "TimeoutCleanUSec": "infinity", "TimeoutStartFailureMode": "terminate", "TimeoutStartUSec": "1min 30s", "TimeoutStopFailureMode": "terminate", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UID": "[not set]", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "masked", "UtmpMode": "init", "WatchdogSignal": "6", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "infinity" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:22 Thursday 16 April 2026 19:42:24 -0400 (0:00:01.420) 0:20:00.299 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:27 Thursday 16 April 2026 19:42:24 -0400 (0:00:00.208) 0:20:00.508 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:38 Thursday 16 April 2026 19:42:25 -0400 (0:00:00.532) 0:20:01.041 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_failed_exception is defined", "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Thursday 16 April 2026 19:42:25 -0400 (0:00:00.216) 0:20:01.258 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382918.6329248, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382918.6329248, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1776382918.6329248, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "934789368", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.776) 0:20:02.034 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume - 3] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:484 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.167) 0:20:02.206 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.189) 0:20:02.395 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.001) 0:20:02.397 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.058) 0:20:02.456 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:42:26 -0400 (0:00:00.210) 0:20:02.666 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": "Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:42:27 -0400 (0:00:01.151) 0:20:03.817 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:42:28 -0400 (0:00:00.229) 0:20:04.047 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:42:29 -0400 (0:00:01.516) 0:20:05.563 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", 
"libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:42:29 -0400 (0:00:00.370) 0:20:05.933 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:42:30 -0400 (0:00:00.287) 0:20:06.220 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:42:30 -0400 (0:00:00.290) 0:20:06.511 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:42:30 -0400 (0:00:00.140) 0:20:06.652 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:42:30 -0400 (0:00:00.176) 0:20:06.828 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:42:31 -0400 (0:00:00.615) 0:20:07.444 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:42:31 -0400 (0:00:00.163) 0:20:07.608 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:42:31 -0400 (0:00:00.228) 0:20:07.836 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:42:33 -0400 (0:00:01.653) 0:20:09.490 ******** ok: [managed-node13] => { "storage_pools | d([])": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:42:33 -0400 (0:00:00.228) 0:20:09.718 ******** ok: [managed-node13] => { "storage_volumes | d([])": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:42:33 -0400 (0:00:00.241) 0:20:09.959 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:42:35 -0400 (0:00:02.062) 0:20:12.022 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:42:36 -0400 (0:00:00.513) 0:20:12.536 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:42:36 -0400 (0:00:00.181) 0:20:12.717 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:42:36 -0400 (0:00:00.262) 0:20:12.980 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:42:37 -0400 (0:00:00.192) 0:20:13.172 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : 
Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:42:39 -0400 (0:00:01.900) 0:20:15.072 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", 
"state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", 
"source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { "name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", "state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { 
"name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": "systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": 
"systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", 
"status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:42:41 -0400 (0:00:02.803) 0:20:17.876 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:42:42 -0400 (0:00:00.481) 0:20:18.357 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, 
"encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:42:55 -0400 (0:00:13.404) 0:20:31.761 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:42:56 -0400 (0:00:00.276) 0:20:32.038 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382842.1029055, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "73c0f2136f47f4644a204bdff45da762b62fbab3", "ctime": 1776382842.0989053, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382842.0989053, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1458, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:42:57 -0400 (0:00:01.237) 0:20:33.276 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:42:58 -0400 (0:00:01.380) 0:20:34.656 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:42:59 -0400 (0:00:00.513) 0:20:35.170 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create 
device", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "mounted" } ], "packages": [ "lvm2", "xfsprogs", "cryptsetup" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:42:59 -0400 (0:00:00.373) 0:20:35.543 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": 
"lvm", "volumes": [ { "_device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:42:59 -0400 (0:00:00.342) 0:20:35.885 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:43:00 -0400 (0:00:00.259) 0:20:36.145 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/foo-test1', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:43:01 -0400 (0:00:01.465) 0:20:37.610 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:43:03 -0400 (0:00:01.600) 0:20:39.211 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { 
"ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:43:04 -0400 (0:00:01.688) 0:20:40.899 ******** skipping: [managed-node13] => (item={'src': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', 'path': '/opt/test1', 'fstype': 'xfs', 'opts': 'defaults', 'dump': 0, 'passno': 0, 'state': 'mounted', 'owner': None, 'group': None, 'mode': None}) => { "ansible_loop_var": "mount_info", "changed": false, "false_condition": "mount_info['owner'] != none or mount_info['group'] != none or mount_info['mode'] != none", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "mounted" }, "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:43:05 -0400 (0:00:00.552) 0:20:41.452 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:43:07 -0400 (0:00:01.710) 0:20:43.163 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382853.7009084, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1776382847.0669067, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 566231242, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1776382847.0677707, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "364645587", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:43:08 -0400 (0:00:01.375) 0:20:44.538 ******** changed: [managed-node13] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', 'password': '-', 'state': 'present'}) => { "ansible_loop_var": "entry", "backup": "", 
"changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:43:09 -0400 (0:00:01.456) 0:20:45.994 ******** ok: [managed-node13] TASK [Verify role results - 10] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:499 Thursday 16 April 2026 19:43:11 -0400 (0:00:01.864) 0:20:47.859 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:43:12 -0400 (0:00:00.887) 0:20:48.746 ******** ok: [managed-node13] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:43:13 -0400 (0:00:00.403) 0:20:49.150 ******** skipping: [managed-node13] => { "false_condition": "_storage_volumes_list | length > 0" } TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:43:13 -0400 (0:00:00.203) 0:20:49.354 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "86f9e8c5-912e-4eca-9538-fec63ecd5601" }, "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "size": "4G", "type": "crypt", "uuid": "7d53cced-da74-43d9-b23d-4debd69fc280" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:43:14 -0400 (0:00:01.198) 0:20:50.552 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002984", "end": "2026-04-16 19:43:15.558845", "rc": 0, "start": "2026-04-16 19:43:15.555861" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
#
UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0
/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601 /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:43:15 -0400 (0:00:01.203) 0:20:51.756 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003090", "end": "2026-04-16 19:43:16.837919", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:43:16.834829" } STDOUT: luks-86f9e8c5-912e-4eca-9538-fec63ecd5601 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:43:17 -0400 (0:00:01.280) 0:20:53.037 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node13 => (item={'disks': ['sda'], 'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'encryption_clevis_pin': None, 'encryption_tang_url': None, 'encryption_tang_thumbprint': None, 'grow_to_fill': False, 'name': 'foo', 'raid_level': None, 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'shared': False, 'state': 'present', 'type': 'lvm', 'volumes': [{'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None,
'_device': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}]}) TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Thursday 16 April 2026 19:43:17 -0400 (0:00:00.480) 0:20:53.517 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Thursday 16 April 2026 19:43:17 -0400 (0:00:00.245) 0:20:53.762 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.031220", "end": "2026-04-16 19:43:18.740005", "rc": 0, "start": "2026-04-16 19:43:18.708785" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Thursday 16 April 2026 19:43:18 -0400 (0:00:01.200) 0:20:54.962 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Thursday 16 April 2026 19:43:19 -0400 (0:00:00.340) 0:20:55.302 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node13 => (item=members) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node13 => (item=volumes) TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Thursday 16 April 2026 19:43:19 -0400 (0:00:00.620) 0:20:55.923 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Thursday 16 April 2026 19:43:20 -0400 (0:00:00.428) 0:20:56.351 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Thursday 16 April 2026 19:43:21 -0400 (0:00:01.179) 0:20:57.531 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Thursday 16 April 2026 19:43:21 -0400 (0:00:00.267) 0:20:57.798 ******** ok: 
[managed-node13] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Thursday 16 April 2026 19:43:22 -0400 (0:00:00.368) 0:20:58.167 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Thursday 16 April 2026 19:43:22 -0400 (0:00:00.399) 0:20:58.566 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 2] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Thursday 16 April 2026 19:43:22 -0400 (0:00:00.367) 0:20:58.934 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type - 3] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Thursday 16 April 2026 19:43:23 -0400 (0:00:00.363) 0:20:59.298 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_pool.raid_level is none", "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:55 Thursday 16 April 2026 19:43:23 -0400 (0:00:00.279) 0:20:59.583 ******** ok: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:68 Thursday 16 April 2026 19:43:23 -0400 (0:00:00.308) 0:20:59.891 ******** ok: [managed-node13] => { "changed": false, "failed_when_result": false, "rc": 0 } STDOUT: ** (process:124904): WARNING **: 19:43:24.801: failed to load module nvme: libbd_nvme.so.2: cannot open shared object file: No such file or directory STDERR: OpenSSH_9.9p1, OpenSSL 3.5.5 27 Jan 2026 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.43.82 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.43.82 originally 10.31.43.82 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
at '/root/.ansible/cp/983a9e969b' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.43.82 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:78 Thursday 16 April 2026 19:43:25 -0400 (0:00:01.165) 0:21:01.057 ******** skipping: [managed-node13] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "false_condition": "storage_test_pool.grow_to_fill | bool", "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:88 Thursday 16 April 2026 19:43:25 -0400 (0:00:00.314) 0:21:01.372 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node13 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Thursday 16 April 2026 19:43:25 -0400 (0:00:00.532) 0:21:01.904 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Thursday 16 April 2026 19:43:26 -0400 (0:00:00.189) 0:21:02.093 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Thursday 16 April 2026 19:43:26 -0400 (0:00:00.197) 0:21:02.291 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Thursday 16 April 2026 19:43:26 -0400 (0:00:00.349) 0:21:02.642 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Thursday 16 April 2026 19:43:26 -0400 (0:00:00.166) 0:21:02.809 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Thursday 16 April 2026 19:43:26 -0400 (0:00:00.160) 0:21:02.969 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Thursday 16 April 2026 19:43:27 -0400 (0:00:00.223) 0:21:03.192 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Thursday 16 April 2026 19:43:27 -0400 (0:00:00.199) 0:21:03.392 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Thursday 16 April 2026 19:43:27 -0400 (0:00:00.196) 0:21:03.589 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Thursday 16 April 2026 19:43:27 -0400 (0:00:00.187) 0:21:03.776 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.raid_level != none", "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Thursday 16 April 2026 19:43:27 -0400 (0:00:00.181) 0:21:03.958 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:91 Thursday 16 April 2026 19:43:28 -0400 (0:00:00.259) 0:21:04.217 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node13 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Thursday 16 April 2026 19:43:28 -0400 (0:00:00.539) 0:21:04.757 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 
'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Thursday 16 April 2026 19:43:29 -0400 (0:00:00.503) 0:21:05.260 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Thursday 16 April 2026 19:43:29 -0400 (0:00:00.212) 0:21:05.472 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Thursday 16 April 2026 19:43:29 -0400 (0:00:00.280) 0:21:05.753 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Thursday 16 April 2026 19:43:30 -0400 (0:00:00.333) 0:21:06.086 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Thursday 16 April 2026 19:43:30 -0400 (0:00:00.250) 0:21:06.337 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Thursday 16 April 2026 19:43:30 -0400 
(0:00:00.300) 0:21:06.638 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Thursday 16 April 2026 19:43:30 -0400 (0:00:00.317) 0:21:06.955 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_lvmraid_volume.raid_level is not none", "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:94 Thursday 16 April 2026 19:43:31 -0400 (0:00:00.341) 0:21:07.296 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node13 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Thursday 16 April 2026 19:43:31 -0400 (0:00:00.680) 0:21:07.977 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Thursday 16 April 2026 19:43:32 -0400 (0:00:00.473) 0:21:08.450 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Thursday 16 April 2026 19:43:32 -0400 (0:00:00.309) 0:21:08.760 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } 
TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Thursday 16 April 2026 19:43:32 -0400 (0:00:00.234) 0:21:08.994 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_thin_volume.thin", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Thursday 16 April 2026 19:43:33 -0400 (0:00:00.264) 0:21:09.259 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:97 Thursday 16 April 2026 19:43:33 -0400 (0:00:00.331) 0:21:09.591 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node13 TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Thursday 16 April 2026 19:43:34 -0400 (0:00:00.730) 0:21:10.321 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Thursday 16 April 2026 19:43:34 -0400 (0:00:00.289) 0:21:10.611 ******** skipping: [managed-node13] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.encryption", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => { "changed": false } MSG: All items skipped TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Thursday 16 April 2026 19:43:34 -0400 (0:00:00.287) 0:21:10.899 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node13 => (item=/dev/sda) TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Thursday 16 April 2026 19:43:35 -0400 (0:00:00.641) 0:21:11.540 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Thursday 16 April 2026 19:43:36 -0400 (0:00:00.532) 0:21:12.072 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the 
crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Thursday 16 April 2026 19:43:36 -0400 (0:00:00.354) 0:21:12.426 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Thursday 16 April 2026 19:43:36 -0400 (0:00:00.243) 0:21:12.670 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "false and _storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Thursday 16 April 2026 19:43:36 -0400 (0:00:00.298) 0:21:12.968 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Thursday 16 April 2026 19:43:37 -0400 (0:00:00.250) 0:21:13.219 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Thursday 16 April 2026 19:43:37 -0400 (0:00:00.223) 0:21:13.443 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:100 Thursday 16 April 2026 19:43:37 -0400 (0:00:00.285) 0:21:13.728 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node13 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Thursday 16 April 2026 19:43:38 -0400 (0:00:00.692) 0:21:14.421 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 
'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Thursday 16 April 2026 19:43:39 -0400 (0:00:00.623) 0:21:15.045 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Thursday 16 April 2026 19:43:39 -0400 (0:00:00.230) 0:21:15.275 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Thursday 16 April 2026 19:43:39 -0400 (0:00:00.195) 0:21:15.471 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Thursday 16 April 2026 19:43:39 -0400 (0:00:00.331) 0:21:15.802 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off - 2] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Thursday 16 April 2026 19:43:40 -0400 (0:00:00.273) 0:21:16.076 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on - 2] ************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Thursday 16 April 2026 19:43:40 -0400 (0:00:00.206) 0:21:16.282 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_vdo_volume.deduplication != none or storage_test_vdo_volume.compression != none", "skip_reason": 
"Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Thursday 16 April 2026 19:43:40 -0400 (0:00:00.260) 0:21:16.544 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:103 Thursday 16 April 2026 19:43:40 -0400 (0:00:00.178) 0:21:16.723 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node13 TASK [Get stratis pool information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Thursday 16 April 2026 19:43:41 -0400 (0:00:00.906) 0:21:17.630 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Print script output] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Thursday 16 April 2026 19:43:41 -0400 (0:00:00.269) 0:21:17.900 ******** skipping: [managed-node13] => { "false_condition": "storage_test_pool.type == 'stratis'" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:19 Thursday 16 April 2026 19:43:42 -0400 (0:00:00.293) 0:21:18.194 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:23 Thursday 16 April 2026 19:43:42 -0400 (0:00:00.237) 0:21:18.437 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:30 Thursday 16 April 2026 19:43:42 -0400 (0:00:00.274) 0:21:18.711 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:39 Thursday 16 April 2026 19:43:42 -0400 (0:00:00.237) 0:21:18.948 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_pool.type == 'stratis'", "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:49 Thursday 16 April 2026 
19:43:43 -0400 (0:00:00.246) 0:21:19.194 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:106 Thursday 16 April 2026 19:43:43 -0400 (0:00:00.254) 0:21:19.449 ******** ok: [managed-node13] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Thursday 16 April 2026 19:43:43 -0400 (0:00:00.285) 0:21:19.734 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': True, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': 'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', 'fs_create_options': '', 'fs_label': '', 'fs_type': 'xfs', 'mount_options': 'defaults', 'mount_point': '/opt/test1', 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'test1', 'raid_level': None, 'size': '4g', 'state': 'present', 'type': 'lvm', 'cached': False, 'cache_devices': [], 'cache_mode': None, 'cache_size': 0, 'compression': None, 'deduplication': None, 'part_type': None, 'raid_disks': [], 'raid_stripe_size': None, 'thin_pool_name': None, 'thin_pool_size': None, 'thin': False, 'vdo_pool_size': None, 'disks': ['sda'], 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_device_count': None, 'raid_spare_count': None, 'raid_chunk_size': None, 'raid_metadata_version': None, '_device': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_raw_device': '/dev/mapper/foo-test1', '_mount_id': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', '_kernel_device': '/dev/dm-1', '_raw_kernel_device': '/dev/dm-0'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:43:44 -0400 (0:00:00.506) 0:21:20.240 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:43:44 -0400 (0:00:00.339) 0:21:20.580 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:43:46 -0400 (0:00:01.952) 0:21:22.533 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:43:46 -0400 (0:00:00.425) 0:21:22.958 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:43:47 -0400 (0:00:00.428) 0:21:23.387 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "(not storage_test_volume.mount_user is none and storage_test_volume.mount_user | length > 0) or (not storage_test_volume.mount_group is none and storage_test_volume.mount_group | length > 0) or (not storage_test_volume.mount_mode is none and storage_test_volume.mount_mode | length > 0)", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:43:47 -0400 (0:00:00.568) 0:21:23.955 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:43:48 -0400 (0:00:00.350) 0:21:24.305 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_user is none", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:43:48 -0400 (0:00:00.443) 0:21:24.749 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_group is none", "skip_reason": "Conditional result was 
False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:43:49 -0400 (0:00:00.381) 0:21:25.130 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.mount_mode is none", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:43:49 -0400 (0:00:00.338) 0:21:25.468 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:43:49 -0400 (0:00:00.261) 0:21:25.751 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:43:49 -0400 (0:00:00.249) 0:21:26.000 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:43:50 -0400 (0:00:00.221) 0:21:26.222 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:43:50 -0400 (0:00:00.229) 0:21:26.452 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:43:51 -0400 (0:00:00.631) 0:21:27.083 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: 
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:43:51 -0400 (0:00:00.483) 0:21:27.566 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:43:51 -0400 (0:00:00.347) 0:21:27.914 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:43:52 -0400 (0:00:00.322) 0:21:28.237 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:43:54 -0400 (0:00:01.966) 0:21:30.203 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:43:54 -0400 (0:00:00.474) 0:21:30.526 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:43:54 -0400 (0:00:00.438) 0:21:31.000 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:43:55 -0400 (0:00:00.438) 0:21:31.438 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382975.281939, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382975.281939, "dev": 6, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2049, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382975.281939, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task
path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:43:56 -0400 (0:00:01.120) 0:21:32.559 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:43:56 -0400 (0:00:00.264) 0:21:32.823 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:43:57 -0400 (0:00:00.237) 0:21:33.061 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:43:57 -0400 (0:00:00.362) 0:21:33.424 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:43:57 -0400 (0:00:00.345) 0:21:33.769 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:43:58 -0400 (0:00:00.293) 0:21:34.063 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:43:58 -0400 (0:00:00.403) 0:21:34.466 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382975.5429392, "attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776382975.5429392, "dev": 6, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 2125, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1776382975.5429392, "nlink": 1, "path": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:43:59 -0400 (0:00:01.388) 0:21:35.854 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:44:01 -0400 (0:00:02.023) 0:21:37.878 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.007471", "end": "2026-04-16 19:44:02.886145", "rc": 0, "start": "2026-04-16 19:44:02.878674" } STDOUT: LUKS header information Version: 2 Epoch: 3 Metadata area: 16384 [bytes] Keyslots area: 16744448 [bytes] UUID: 86f9e8c5-912e-4eca-9538-fec63ecd5601 Label: (no label) Subsystem: (no subsystem) Flags: (no flags) Data segments: 0: crypt offset: 16777216 [bytes] length: (whole device) cipher: aes-xts-plain64 sector: 512 [bytes] Keyslots: 0: luks2 Key: 512 bits Priority: normal Cipher: aes-xts-plain64 Cipher key: 512 bits PBKDF: argon2id Time cost: 4 Memory: 688159 Threads: 2 Salt: 61 5b 68 af 2a 71 34 41 99 8f 05 48 c5 fc e3 4a cc a7 30 45 b6 10 ac eb 69 55 1c 62 fe 78 af 6f AF stripes: 4000 AF hash: sha256 Area offset:32768 [bytes] Area length:258048 [bytes] Digest ID: 0 Tokens: Digests: 0: pbkdf2 Hash: sha256 Iterations: 131995 Salt: e8 f7 cd 1d 90 9f 70 d5 2c 4d c2 52 17 72 27 84 63 32 ee b8 47 83 7a 92 bf c4 d0 41 d5 d2 23 16 Digest: bf 9d bf 98 41 91 74 d5 10 16 93 8a 6b 48 1f 46 93 08 7c b9 46 38 29 8f bf d4 89 45 db ed 28 9f
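The luksDump output above is what the subsequent per-attribute checks consume: version, cipher, and key size are only asserted when the test volume requests specific values, which is why those three checks are skipped below. A minimal sketch of the collect-and-assert pattern, with invented task and register names (not the test's actual source):

    - name: Collect LUKS info for this volume (sketch; register name invented)
      ansible.builtin.command: cryptsetup luksDump /dev/mapper/foo-test1
      register: _luks_dump
      changed_when: false

    - name: Check LUKS version (sketch; mirrors the skip condition shown below)
      ansible.builtin.assert:
        that:
          - _luks_dump.stdout is search('Version:\s+' ~ storage_test_volume.encryption_luks_version)
      when: storage_test_volume.encryption_luks_version is not none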
none", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:44:05 -0400 (0:00:00.364) 0:21:41.065 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_key_size is none", "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:44:05 -0400 (0:00:00.416) 0:21:41.481 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.encryption_cipher is none", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:44:05 -0400 (0:00:00.347) 0:21:41.828 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:44:06 -0400 (0:00:00.624) 0:21:42.453 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:44:06 -0400 (0:00:00.374) 0:21:42.828 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:44:07 -0400 (0:00:00.387) 0:21:43.219 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:44:07 -0400 (0:00:00.469) 0:21:43.688 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:44:07 -0400 (0:00:00.291) 0:21:43.980 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: 
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:44:08 -0400 (0:00:00.268) 0:21:44.248 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:44:08 -0400 (0:00:00.198) 0:21:44.447 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:44:08 -0400 (0:00:00.211) 0:21:44.658 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:44:08 -0400 (0:00:00.225) 0:21:44.884 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:44:09 -0400 (0:00:00.257) 0:21:45.141 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:44:09 -0400 (0:00:00.303) 0:21:45.445 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:44:09 -0400 (0:00:00.267) 0:21:45.713 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:44:09 -0400 (0:00:00.224) 0:21:45.937 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Thursday 16 April 2026 19:44:10 -0400 (0:00:00.237) 0:21:46.175 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:44:10 -0400 (0:00:00.288) 0:21:46.464 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:44:10 -0400 (0:00:00.320) 0:21:46.785 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:44:12 -0400 (0:00:01.305) 0:21:48.091 ******** ok: [managed-node13] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:44:13 -0400 (0:00:01.423) 0:21:49.514 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:44:13 -0400 (0:00:00.381) 0:21:49.896 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:44:14 -0400 (0:00:00.249) 0:21:50.146 ******** ok: [managed-node13] => { "bytes": 10726680821, "changed": false, "lvm": "9g", "parted": "9GiB", "size": "9 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:44:15 -0400 (0:00:01.273) 0:21:51.419 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:44:15 -0400 (0:00:00.398) 0:21:51.818 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:44:16 -0400 
(0:00:00.361) 0:21:52.179 ******** skipping: [managed-node13] => { "false_condition": "'%' in storage_test_volume.size | string" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:44:16 -0400 (0:00:00.324) 0:21:52.504 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "'%' in storage_test_volume.size | string", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:44:16 -0400 (0:00:00.320) 0:21:52.824 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:44:17 -0400 (0:00:00.302) 0:21:53.126 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:44:17 -0400 (0:00:00.340) 0:21:53.466 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:44:17 -0400 (0:00:00.293) 0:21:53.760 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:44:17 -0400 (0:00:00.241) 0:21:54.002 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:44:18 -0400 (0:00:00.226) 0:21:54.229 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:44:18 -0400 (0:00:00.252) 0:21:54.481 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional 
result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:44:18 -0400 (0:00:00.253) 0:21:54.734 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:44:18 -0400 (0:00:00.233) 0:21:54.967 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:44:19 -0400 (0:00:00.292) 0:21:55.260 ******** skipping: [managed-node13] => { "false_condition": "storage_test_volume.thin | bool" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:44:19 -0400 (0:00:00.358) 0:21:55.618 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:44:19 -0400 (0:00:00.391) 0:21:56.009 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:44:20 -0400 (0:00:00.326) 0:21:56.337 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:44:20 -0400 (0:00:00.362) 0:21:56.700 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:44:21 -0400 (0:00:00.376) 0:21:57.077 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.thin | bool", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:44:21 -0400 (0:00:00.417) 0:21:57.495 ******** ok: 
[managed-node13] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:44:21 -0400 (0:00:00.371) 0:21:57.866 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:44:22 -0400 (0:00:00.316) 0:21:58.182 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:44:22 -0400 (0:00:00.405) 0:21:58.588 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.029519", "end": "2026-04-16 19:44:23.701365", "rc": 0, "start": "2026-04-16 19:44:23.671846" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:44:23 -0400 (0:00:01.288) 0:21:59.877 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:44:24 -0400 (0:00:00.379) 0:22:00.256 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed
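The lvs call above uses --nameprefixes so each field comes back as an LVM2_* key=value pair; the "Set LV segment type" task then lifts LVM2_SEGTYPE out of stdout into a fact before asserting on it. A minimal sketch of that parse-and-check step, assuming the command result is registered under the invented name _lvs_info:

    - name: Set LV segment type (sketch; register name invented)
      ansible.builtin.set_fact:
        storage_test_lv_segtype: '{{ _lvs_info.stdout | regex_findall("LVM2_SEGTYPE=(\S+)") }}'

    - name: Check segment type (sketch; the real test compares against the expected volume settings)
      ansible.builtin.assert:
        that:
          - storage_test_lv_segtype[0] == 'linear'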
"skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:44:25 -0400 (0:00:00.300) 0:22:01.320 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.cached | bool", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:44:25 -0400 (0:00:00.225) 0:22:01.546 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:44:25 -0400 (0:00:00.238) 0:22:01.784 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:44:26 -0400 (0:00:00.353) 0:22:02.138 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:502 Thursday 16 April 2026 19:44:26 -0400 (0:00:00.386) 0:22:02.524 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml for managed-node13 TASK [Clear facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:9 Thursday 16 April 2026 19:44:27 -0400 (0:00:00.882) 0:22:03.407 ******** META: facts cleared TASK [Run the role] ************************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:23 Thursday 16 April 2026 19:44:27 -0400 (0:00:00.052) 0:22:03.460 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__sr_failed_when is defined", "skip_reason": "Conditional result was False" } TASK [Run the role normally] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tasks/run_role_with_clear_facts.yml:33 Thursday 16 April 2026 19:44:27 -0400 (0:00:00.253) 0:22:03.713 ******** included: fedora.linux_system_roles.storage for managed-node13 TASK [fedora.linux_system_roles.storage : Record storage role fingerprint in syslog] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Thursday 16 April 2026 19:44:28 -0400 (0:00:00.440) 0:22:04.154 ******** ok: [managed-node13] => { "ansible_facts": { "discovered_interpreter_python": "/usr/bin/python3.9" }, "changed": false, "message": 
"Message written to syslog" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:7 Thursday 16 April 2026 19:44:29 -0400 (0:00:01.084) 0:22:05.238 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Thursday 16 April 2026 19:44:29 -0400 (0:00:00.233) 0:22:05.471 ******** ok: [managed-node13] TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Thursday 16 April 2026 19:44:31 -0400 (0:00:01.778) 0:22:07.250 ******** skipping: [managed-node13] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node13] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node13] => (item=CentOS_9.yml) => { "ansible_facts": { "blivet_package_list": [ "python3-blivet", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "vdo", "kmod-kvdo", "xfsprogs", "stratisd", "stratis-cli", "{{ 'libblockdev-s390' if ansible_facts['architecture'] == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Thursday 16 April 2026 19:44:31 -0400 (0:00:00.552) 0:22:07.802 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Thursday 16 April 2026 19:44:32 -0400 (0:00:00.269) 0:22:08.072 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not __storage_is_ostree is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:10 Thursday 16 April 2026 19:44:32 -0400 (0:00:00.360) 0:22:08.432 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:14 Thursday 16 April 2026 19:44:32 -0400 (0:00:00.227) 0:22:08.660 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:18 Thursday 16 April 2026 19:44:32 -0400 (0:00:00.245) 0:22:08.906 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Add repo key] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Thursday 16 April 2026 19:44:33 -0400 (0:00:00.686) 0:22:09.592 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Add blivet repo] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:15 Thursday 16 April 2026 19:44:33 -0400 (0:00:00.303) 0:22:09.896 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_blivet_custom_repo.key is defined", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Thursday 16 April 2026 19:44:34 -0400 (0:00:00.340) 0:22:10.237 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:27 Thursday 16 April 2026 19:44:35 -0400 (0:00:01.665) 0:22:11.902 ******** ok: [managed-node13] => { "storage_pools | d([])": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:32 Thursday 16 April 2026 19:44:36 -0400 (0:00:00.195) 0:22:12.097 ******** ok: [managed-node13] => { "storage_volumes | d([])": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] }
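The storage_volumes value printed above is the entire input to this cleanup pass: the role is re-run with the test volume marked absent so that blivet removes it and its LUKS layer. Reconstructed as playbook YAML from the values shown (the include mechanism is an assumption, not taken from the test file):

    - name: Clean up the test volume (sketch; invocation style assumed)
      ansible.builtin.include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            state: absent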
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Thursday 16 April 2026 19:44:36 -0400 (0:00:00.242) 0:22:12.340 ******** ok: [managed-node13] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:50 Thursday 16 April 2026 19:44:38 -0400 (0:00:02.267) 0:22:14.607 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node13 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Thursday 16 April 2026 19:44:39 -0400 (0:00:00.437) 0:22:15.045 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Thursday 16 April 2026 19:44:39 -0400 (0:00:00.211) 0:22:15.256 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "install_copr | d(false) | bool", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Thursday 16 April 2026 19:44:39 -0400 (0:00:00.302) 0:22:15.462 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:56 Thursday 16 April 2026 19:44:39 -0400 (0:00:00.302) 0:22:15.765 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Thursday 16 April 2026 19:44:41 -0400 (0:00:01.930) 0:22:17.696 ******** ok: [managed-node13] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "apt-daily.service": { "name": "apt-daily.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autofs.service": { "name": "autofs.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "alias" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "chrony-wait.service": {
"name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd-restricted.service": { "name": "chronyd-restricted.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-hotplugd.service": { "name": "cloud-init-hotplugd.service", "source": "systemd", "state": "inactive", "status": "static" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-broker.service": { "name": "dbus-broker.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "alias" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "alias" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "active", "status": "alias" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "display-manager.service": { "name": "display-manager.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-makecache.service": { "name": "dnf-makecache.service", "source": "systemd", "state": "stopped", "status": "static" }, "dnf-system-upgrade-cleanup.service": { "name": "dnf-system-upgrade-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "dnf-system-upgrade.service": { "name": "dnf-system-upgrade.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown-onfailure.service": { "name": "dracut-shutdown-onfailure.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "fcoe.service": { "name": "fcoe.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "active" }, "grub-boot-indeterminate.service": { "name": "grub-boot-indeterminate.service", "source": "systemd", "state": "inactive", "status": "static" }, "grub2-systemd-integration.service": { "name": "grub2-systemd-integration.service", "source": "systemd", "state": "inactive", "status": "static" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "hv_kvp_daemon.service": { "name": "hv_kvp_daemon.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "iscsi-shutdown.service": { "name": "iscsi-shutdown.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsi.service": { "name": "iscsi.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "iscsid.service": { "name": "iscsid.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "kvm_stat.service": { "name": "kvm_stat.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "ldconfig.service": { "name": "ldconfig.service", "source": "systemd", "state": "stopped", "status": "static" }, "logrotate.service": { 
"name": "logrotate.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm-devices-import.service": { "name": "lvm-devices-import.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "lvm2-activation-early.service": { "name": "lvm2-activation-early.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "man-db-cache-update.service": { "name": "man-db-cache-update.service", "source": "systemd", "state": "inactive", "status": "static" }, "man-db-restart-cache-update.service": { "name": "man-db-restart-cache-update.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "modprobe@.service": { "name": "modprobe@.service", "source": "systemd", "state": "unknown", "status": "static" }, "modprobe@configfs.service": { "name": "modprobe@configfs.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@drm.service": { "name": "modprobe@drm.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@efi_pstore.service": { "name": "modprobe@efi_pstore.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "modprobe@fuse.service": { "name": "modprobe@fuse.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "multipathd.service": { "name": "multipathd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "ndctl-monitor.service": { "name": "ndctl-monitor.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfsdcld.service": { "name": "nfsdcld.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "nftables.service": { "name": "nftables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nis-domainname.service": { "name": "nis-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "nm-priv-helper.service": { "name": "nm-priv-helper.service", "source": "systemd", "state": "inactive", "status": "static" }, "ntpd.service": { "name": "ntpd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ntpdate.service": { "name": "ntpdate.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "oddjobd.service": { "name": "oddjobd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "pam_namespace.service": { "name": "pam_namespace.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "raid-check.service": { "name": "raid-check.service", "source": "systemd", "state": "stopped", "status": "static" }, "rbdmap.service": { "name": "rbdmap.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-svcgssd.service": { "name": "rpc-svcgssd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpmdb-rebuild.service": { "name": "rpmdb-rebuild.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-autorelabel-mark.service": { "name": "selinux-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "selinux-autorelabel.service": { "name": "selinux-autorelabel.service", "source": "systemd", "state": "inactive", "status": "static" }, "selinux-check-proper-disable.service": { "name": "selinux-check-proper-disable.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "indirect" }, 
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "active" }, "sntp.service": { "name": "sntp.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "sshd-keygen@.service": { "name": "sshd-keygen@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "sshd-keygen@ecdsa.service": { "name": "sshd-keygen@ecdsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@ed25519.service": { "name": "sshd-keygen@ed25519.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd-keygen@rsa.service": { "name": "sshd-keygen@rsa.service", "source": "systemd", "state": "stopped", "status": "inactive" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "sssd-autofs.service": { "name": "sssd-autofs.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-kcm.service": { "name": "sssd-kcm.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "sssd-nss.service": { "name": "sssd-nss.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pac.service": { "name": "sssd-pac.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-pam.service": { "name": "sssd-pam.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-ssh.service": { "name": "sssd-ssh.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd-sudo.service": { "name": "sssd-sudo.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "sssd.service": { "name": "sssd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "stratis-fstab-setup@.service": { "name": "stratis-fstab-setup@.service", "source": "systemd", "state": "unknown", "status": "static" }, "stratisd-min-postinitrd.service": { "name": "stratisd-min-postinitrd.service", "source": "systemd", "state": "inactive", "status": "static" }, "stratisd.service": { "name": "stratisd.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "syslog.service": { "name": "syslog.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "system-update-cleanup.service": { "name": "system-update-cleanup.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bless-boot.service": { "name": "systemd-bless-boot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-boot-check-no-failures.service": { "name": "systemd-boot-check-no-failures.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"systemd-boot-random-seed.service": { "name": "systemd-boot-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-boot-update.service": { "name": "systemd-boot-update.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-coredump@.service": { "name": "systemd-coredump@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-exit.service": { "name": "systemd-exit.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-growfs-root.service": { "name": "systemd-growfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-growfs@.service": { "name": "systemd-growfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-journald@.service": { "name": "systemd-journald@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-network-generator.service": { "name": "systemd-network-generator.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-networkd-wait-online.service": { "name": 
"systemd-networkd-wait-online.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-pcrfs-root.service": { "name": "systemd-pcrfs-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pcrfs@.service": { "name": "systemd-pcrfs@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-pcrmachine.service": { "name": "systemd-pcrmachine.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-initrd.service": { "name": "systemd-pcrphase-initrd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase-sysinit.service": { "name": "systemd-pcrphase-sysinit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-pcrphase.service": { "name": "systemd-pcrphase.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-pstore.service": { "name": "systemd-pstore.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "enabled-runtime" }, "systemd-repart.service": { "name": "systemd-repart.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill.service": { "name": "systemd-rfkill.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend-then-hibernate.service": { "name": "systemd-suspend-then-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-sysext.service": { "name": "systemd-sysext.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "systemd-sysupdate-reboot.service": { "name": "systemd-sysupdate-reboot.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysupdate.service": { "name": "systemd-sysupdate.service", "source": "systemd", "state": "inactive", "status": "indirect" }, "systemd-sysusers.service": { "name": "systemd-sysusers.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-timesyncd.service": { "name": "systemd-timesyncd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles.service": { "name": "systemd-tmpfiles.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-volatile-root.service": { "name": "systemd-volatile-root.service", "source": "systemd", "state": "inactive", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@.service": { "name": "user-runtime-dir@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user-runtime-dir@0.service": { "name": "user-runtime-dir@0.service", "source": "systemd", "state": "stopped", "status": "active" }, "user@.service": { "name": "user@.service", "source": "systemd", "state": "unknown", "status": "static" }, "user@0.service": { "name": "user@0.service", "source": "systemd", "state": "running", "status": "active" }, "ypbind.service": { "name": "ypbind.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "yppasswdd.service": { "name": "yppasswdd.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypserv.service": { "name": "ypserv.service", "source": "systemd", "state": "stopped", "status": "not-found" }, "ypxfrd.service": { "name": "ypxfrd.service", "source": "systemd", "state": "stopped", "status": "not-found" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:82 Thursday 16 April 2026 19:44:46 -0400 (0:00:04.997) 0:22:22.693 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Thursday 16 April 2026 19:44:47 -0400 (0:00:00.405) 0:22:23.099 ******** changed: [managed-node13] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": "xfs" }, { 
"action": "destroy device", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:103 Thursday 16 April 2026 19:44:50 -0400 (0:00:02.959) 0:22:26.058 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_udevadm_trigger | d(false)", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:110 Thursday 16 April 2026 19:44:50 -0400 (0:00:00.252) 0:22:26.311 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382984.6629415, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3c67f73abf8fc34648cd8b6b9db3d87e3750ec6f", "ctime": 1776382984.6599414, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 218104009, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1776382984.6599414, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1490, "uid": 0, "version": "292099060", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add 
fingerprint to /etc/fstab if present] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:115 Thursday 16 April 2026 19:44:51 -0400 (0:00:01.256) 0:22:27.568 ******** ok: [managed-node13] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:133 Thursday 16 April 2026 19:44:52 -0400 (0:00:01.259) 0:22:28.828 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:139 Thursday 16 April 2026 19:44:53 -0400 (0:00:00.528) 0:22:29.356 ******** ok: [managed-node13] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sda", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "absent" } ], "packages": [ "xfsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Thursday 16 April 2026 19:44:53 -0400 (0:00:00.337) 0:22:29.694 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:152 Thursday 16 April 2026 19:44:53 -0400 (0:00:00.281) 0:22:29.975 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:168 Thursday 16 April 2026 19:44:54 -0400 (0:00:00.374) 0:22:30.350 ******** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount changed: [managed-node13] => (item={'src': '/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', 'path': '/opt/test1', 'state': 'absent', 'fstype': 'xfs'}) => { "ansible_loop_var": "mount_info", "backup_file": "", "boot": "yes", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-86f9e8c5-912e-4eca-9538-fec63ecd5601" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 Thursday 16 April 2026 19:44:55 -0400 (0:00:01.331) 0:22:31.682 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184 Thursday 16 April 2026 19:44:57 -0400 (0:00:01.680) 0:22:33.362 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Thursday 16 April 2026 19:44:57 -0400 (0:00:00.293) 0:22:33.655 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:207 Thursday 16 April 2026 19:44:57 -0400 (0:00:00.369) 0:22:34.025 ******** ok: [managed-node13] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:215 Thursday 16 April 2026 19:44:59 -0400 (0:00:01.637) 0:22:35.663 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776382996.8359444, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f796d477950ccd8e38f9af9d7d8b68d2927b947a", "ctime": 1776382989.7749426, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 192938179, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1776382989.7755983, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1626794079", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:220 Thursday 16 April 2026 19:45:00 -0400 (0:00:01.239) 0:22:36.902 ******** changed: [managed-node13] => (item={'backing_device': '/dev/mapper/foo-test1', 'name': 'luks-86f9e8c5-912e-4eca-9538-fec63ecd5601', 'password': '-', 'state': 'absent'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-86f9e8c5-912e-4eca-9538-fec63ecd5601", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:242 Thursday 16 April 2026 19:45:02 -0400 (0:00:01.337) 0:22:38.241 ******** ok: [managed-node13] TASK [Verify role results - 11] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:511 Thursday 16 April 2026 19:45:04 -0400 (0:00:01.995) 0:22:40.236 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node13 TASK [Print out pool information] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Thursday 16 April 2026 19:45:05 -0400 (0:00:00.803) 0:22:41.040 ******** skipping: [managed-node13] => { "false_condition": "_storage_pools_list | length > 0" } TASK [Print out volume information] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Thursday 16 April 2026 19:45:05 -0400 (0:00:00.256) 0:22:41.297 ******** ok: [managed-node13] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": 
"UUID=CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Thursday 16 April 2026 19:45:05 -0400 (0:00:00.349) 0:22:41.647 ******** ok: [managed-node13] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "xfs", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "fb6c01b6-de24-4b5b-a70c-3231ceae3fbd" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Thursday 16 April 2026 19:45:06 -0400 (0:00:01.182) 0:22:42.850 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002893", "end": "2026-04-16 19:45:07.882837", "rc": 0, "start": "2026-04-16 19:45:07.879944" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Tue Apr 14 06:59:53 2026 # # Accessible filesystems, by reference, are maintained under '/dev/disk/'. # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info. # # After editing this file, run 'systemctl daemon-reload' to update systemd # units generated from this file. 
# UUID=fb6c01b6-de24-4b5b-a70c-3231ceae3fbd / xfs defaults 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_engineering_sm/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 ntap-rdu2-c01-eng01-nfs01b.storage.rdu2.redhat.com:/bos_eng01_devops_brew_scratch_nfs_sm/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr,noauto 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Thursday 16 April 2026 19:45:08 -0400 (0:00:01.245) 0:22:44.096 ******** ok: [managed-node13] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002961", "end": "2026-04-16 19:45:09.125402", "failed_when_result": false, "rc": 0, "start": "2026-04-16 19:45:09.122441" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Thursday 16 April 2026 19:45:09 -0400 (0:00:01.227) 0:22:45.323 ******** skipping: [managed-node13] => { "changed": false, "skipped_reason": "No items in the list" } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:43 Thursday 16 April 2026 19:45:09 -0400 (0:00:00.201) 0:22:45.525 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node13 => (item={'encryption': False, 'encryption_cipher': None, 'encryption_key': None, 'encryption_key_size': None, 'encryption_luks_version': None, 'encryption_password': None, 'fs_create_options': '', 'fs_label': '', 'fs_type': 'lvmpv', 'mount_options': 'defaults', 'mount_point': None, 'mount_user': None, 'mount_group': None, 'mount_mode': None, 'name': 'foo', 'raid_level': None, 'size': 10737418240, 'state': 'absent', 'type': 'disk', 'disks': ['sda'], 'raid_device_count': None, 'raid_spare_count': None, 'raid_metadata_version': None, 'raid_chunk_size': None, 'fs_overwrite_existing': True, 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid', 'raid_stripe_size': None, 'compression': None, 'deduplication': None, 'vdo_pool_size': None, 'thin': None, 'thin_pool_name': None, 'thin_pool_size': None, 'cached': False, 'cache_size': 0, 'cache_mode': None, 'cache_devices': [], '_device': '/dev/sda', '_raw_device': '/dev/sda', '_mount_id': 'UUID=CeOF2F-JH5x-FPcS-IIeE-QvP6-4d1m-bl1dga'}) TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Thursday 16 April 2026 19:45:10 -0400 (0:00:00.506) 
0:22:46.031 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for storage_test_volume_subset] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Thursday 16 April 2026 19:45:10 -0400 (0:00:00.258) 0:22:46.290 ******** included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node13 => (item=mount) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node13 => (item=fstab) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node13 => (item=fs) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node13 => (item=device) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node13 => (item=encryption) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node13 => (item=md) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node13 => (item=size) included: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node13 => (item=cache) TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Thursday 16 April 2026 19:45:11 -0400 (0:00:01.449) 0:22:47.740 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Thursday 16 April 2026 19:45:12 -0400 (0:00:00.399) 0:22:48.139 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Thursday 16 April 2026 19:45:12 -0400 (0:00:00.410) 0:22:48.549 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:32 Thursday 16 April 2026 19:45:12 -0400 (0:00:00.190) 0:22:48.740 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:40 Thursday 16 April 2026 19:45:12 -0400 (0:00:00.172) 0:22:48.912 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:51 Thursday 16 April 2026 19:45:13 -0400 (0:00:00.186) 0:22:49.098 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:62 Thursday 16 April 2026 19:45:13 -0400 (0:00:00.176) 0:22:49.275 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:76 Thursday 16 April 2026 19:45:13 -0400 (0:00:00.168) 0:22:49.443 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:82 Thursday 16 April 2026 19:45:13 -0400 (0:00:00.214) 0:22:49.657 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:88 Thursday 16 April 2026 19:45:13 -0400 (0:00:00.193) 0:22:49.851 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.fs_type == \"swap\"", "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:98 Thursday 16 April 2026 19:45:14 -0400 (0:00:00.209) 0:22:50.061 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Thursday 16 April 2026 19:45:14 -0400 (0:00:00.195) 0:22:50.256 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": 
[], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Thursday 16 April 2026 19:45:14 -0400 (0:00:00.626) 0:22:50.883 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Thursday 16 April 2026 19:45:15 -0400 (0:00:00.168) 0:22:51.051 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Thursday 16 April 2026 19:45:15 -0400 (0:00:00.292) 0:22:51.344 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "__storage_verify_mount_options | d(false)", "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Thursday 16 April 2026 19:45:15 -0400 (0:00:00.191) 0:22:51.551 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:52 Thursday 16 April 2026 19:45:15 -0400 (0:00:00.309) 0:22:51.860 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Thursday 16 April 2026 19:45:16 -0400 (0:00:00.260) 0:22:52.121 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Thursday 16 April 2026 19:45:16 -0400 (0:00:00.299) 0:22:52.420 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Thursday 16 April 2026 19:45:16 -0400 (0:00:00.237) 0:22:52.693 ******** ok: [managed-node13] => { "changed": false, "stat": { "atime": 1776383089.727968, 
"attr_flags": "", "attributes": [], "block_size": 512, "blocks": 0, "charset": "binary", "ctime": 1776383089.727968, "dev": 6, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 450, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1776383089.727968, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Thursday 16 April 2026 19:45:17 -0400 (0:00:01.233) 0:22:53.926 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node - 2] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Thursday 16 April 2026 19:45:18 -0400 (0:00:00.330) 0:22:54.256 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not (_storage_test_volume_present or storage_test_volume.type == 'disk')", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Thursday 16 April 2026 19:45:18 -0400 (0:00:00.256) 0:22:54.513 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Thursday 16 April 2026 19:45:18 -0400 (0:00:00.209) 0:22:54.722 ******** ok: [managed-node13] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Thursday 16 April 2026 19:45:18 -0400 (0:00:00.271) 0:22:54.994 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == \"raid\"", "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Thursday 16 April 2026 19:45:19 -0400 (0:00:00.231) 0:22:55.225 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Thursday 16 April 2026 19:45:19 -0400 (0:00:00.173) 0:22:55.398 ******** skipping: [managed-node13] => { "changed": false, "false_condition": 
"storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Thursday 16 April 2026 19:45:19 -0400 (0:00:00.199) 0:22:55.598 ******** ok: [managed-node13] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Thursday 16 April 2026 19:45:21 -0400 (0:00:01.631) 0:22:57.230 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Thursday 16 April 2026 19:45:21 -0400 (0:00:00.167) 0:22:57.397 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Thursday 16 April 2026 19:45:21 -0400 (0:00:00.259) 0:22:57.657 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Thursday 16 April 2026 19:45:21 -0400 (0:00:00.152) 0:22:57.809 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Thursday 16 April 2026 19:45:21 -0400 (0:00:00.184) 0:22:57.993 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present and storage_test_volume.encryption", "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Thursday 16 April 2026 19:45:22 -0400 (0:00:00.145) 0:22:58.139 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:64 Thursday 16 April 2026 19:45:22 -0400 (0:00:00.173) 0:22:58.313 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": 
"Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:77 Thursday 16 April 2026 19:45:22 -0400 (0:00:00.171) 0:22:58.484 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:90 Thursday 16 April 2026 19:45:22 -0400 (0:00:00.242) 0:22:58.727 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:96 Thursday 16 April 2026 19:45:23 -0400 (0:00:00.375) 0:22:59.102 ******** ok: [managed-node13] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:103 Thursday 16 April 2026 19:45:23 -0400 (0:00:00.281) 0:22:59.384 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:111 Thursday 16 April 2026 19:45:23 -0400 (0:00:00.243) 0:22:59.628 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:119 Thursday 16 April 2026 19:45:23 -0400 (0:00:00.203) 0:22:59.831 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_expected_crypttab_entries | int == 1", "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:127 Thursday 16 April 2026 19:45:24 -0400 (0:00:00.247) 0:23:00.078 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Thursday 16 April 2026 19:45:24 -0400 (0:00:00.233) 0:23:00.311 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": 
"Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Thursday 16 April 2026 19:45:24 -0400 (0:00:00.199) 0:23:00.511 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Thursday 16 April 2026 19:45:24 -0400 (0:00:00.223) 0:23:00.734 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Thursday 16 April 2026 19:45:24 -0400 (0:00:00.207) 0:23:00.942 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Thursday 16 April 2026 19:45:25 -0400 (0:00:00.226) 0:23:01.169 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Thursday 16 April 2026 19:45:25 -0400 (0:00:00.189) 0:23:01.358 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Thursday 16 April 2026 19:45:25 -0400 (0:00:00.261) 0:23:01.620 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Thursday 16 April 2026 19:45:25 -0400 (0:00:00.216) 0:23:01.836 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Thursday 16 April 2026 19:45:26 -0400 (0:00:00.223) 0:23:02.059 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** 
task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Thursday 16 April 2026 19:45:26 -0400 (0:00:00.246) 0:23:02.306 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'raid'", "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Thursday 16 April 2026 19:45:26 -0400 (0:00:00.231) 0:23:02.538 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Thursday 16 April 2026 19:45:26 -0400 (0:00:00.209) 0:23:02.747 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Thursday 16 April 2026 19:45:26 -0400 (0:00:00.195) 0:23:02.943 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.160) 0:23:03.103 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.219) 0:23:03.322 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.124) 0:23:03.447 ******** skipping: [managed-node13] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.194) 0:23:03.642 ******** skipping: [managed-node13] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Show test pool size] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.155) 0:23:03.797 ******** skipping: [managed-node13] => { "false_condition": "_storage_test_volume_present | bool" } TASK [Calculate the 
expected size based on pool size and percentage value] ***** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Thursday 16 April 2026 19:45:27 -0400 (0:00:00.189) 0:23:03.987 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:68 Thursday 16 April 2026 19:45:28 -0400 (0:00:00.202) 0:23:04.190 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:72 Thursday 16 April 2026 19:45:28 -0400 (0:00:00.279) 0:23:04.469 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:77 Thursday 16 April 2026 19:45:28 -0400 (0:00:00.343) 0:23:04.812 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:83 Thursday 16 April 2026 19:45:28 -0400 (0:00:00.207) 0:23:05.020 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:87 Thursday 16 April 2026 19:45:29 -0400 (0:00:00.247) 0:23:05.268 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:92 Thursday 16 April 2026 19:45:29 -0400 (0:00:00.273) 0:23:05.541 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:97 Thursday 16 April 2026 19:45:29 -0400 (0:00:00.218) 0:23:05.760 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: 
/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:102 Thursday 16 April 2026 19:45:30 -0400 (0:00:00.278) 0:23:06.038 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:106 Thursday 16 April 2026 19:45:30 -0400 (0:00:00.262) 0:23:06.300 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Show test volume size] *************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:110 Thursday 16 April 2026 19:45:30 -0400 (0:00:00.203) 0:23:06.504 ******** skipping: [managed-node13] => { "false_condition": "not storage_test_volume.thin is none" } TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:114 Thursday 16 April 2026 19:45:30 -0400 (0:00:00.203) 0:23:06.708 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value - 2] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:121 Thursday 16 April 2026 19:45:30 -0400 (0:00:00.262) 0:23:06.970 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:128 Thursday 16 April 2026 19:45:31 -0400 (0:00:00.235) 0:23:07.206 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:132 Thursday 16 April 2026 19:45:31 -0400 (0:00:00.185) 0:23:07.391 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:138 Thursday 16 April 2026 19:45:31 -0400 (0:00:00.258) 0:23:07.649 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "not storage_test_volume.thin is none", "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:144 Thursday 16 April 2026 19:45:31 -0400 (0:00:00.248) 0:23:07.897 ******** ok: [managed-node13] => { "storage_test_actual_size": { "changed": false, 
"false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size - 2] ************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:148 Thursday 16 April 2026 19:45:32 -0400 (0:00:00.197) 0:23:08.095 ******** ok: [managed-node13] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:152 Thursday 16 April 2026 19:45:32 -0400 (0:00:00.229) 0:23:08.324 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "_storage_test_volume_present | bool", "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Thursday 16 April 2026 19:45:32 -0400 (0:00:00.207) 0:23:08.532 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Thursday 16 April 2026 19:45:32 -0400 (0:00:00.194) 0:23:08.726 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Thursday 16 April 2026 19:45:32 -0400 (0:00:00.157) 0:23:08.883 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.223) 0:23:09.106 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.195) 0:23:09.302 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.220) 0:23:09.523 ******** skipping: [managed-node13] => { "changed": 
false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.182) 0:23:09.706 ******** skipping: [managed-node13] => { "changed": false, "false_condition": "storage_test_volume.type == 'lvm' and _storage_test_volume_present", "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.161) 0:23:09.867 ******** ok: [managed-node13] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:52 Thursday 16 April 2026 19:45:33 -0400 (0:00:00.153) 0:23:10.020 ******** ok: [managed-node13] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } PLAY RECAP ********************************************************************* managed-node13 : ok=1287 changed=60 unreachable=0 failed=0 skipped=1115 rescued=18 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:23:36.161373+00:00Z", "host": "managed-node13", "message": "encrypted volume 'foo' missing key/password", "start_time": "2026-04-16T23:23:34.381441+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:23:36.572524+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": 
"present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'foo' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:23:36.244424+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:25:34.957584+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537' in safe mode due to encryption removal", "start_time": "2026-04-16T23:25:32.816751+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:25:35.221928+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, 
"cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10720641024, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-081a4f92-2987-47ea-b6a5-2bc265a88537' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:25:35.004594+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:27:10.464357+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "start_time": "2026-04-16T23:27:08.438902+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:27:10.676665+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ 
"sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:27:10.509446+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:28:57.490952+00:00Z", "host": "managed-node13", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-16T23:28:55.486919+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:28:57.795725+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:28:57.523434+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:31:08.239979+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02' in safe mode due to encryption removal", "start_time": "2026-04-16T23:31:06.105642+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:31:08.574909+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks2", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-2faf63cf-a8c6-4dd2-b395-d9542a5f7a02' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:31:08.310429+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:33:22.592829+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "start_time": "2026-04-16T23:33:20.344832+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:33:22.932523+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, 
"raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:33:22.640748+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:35:38.093724+00:00Z", "host": "managed-node13", "message": "encrypted volume 'test1' missing key/password", "start_time": "2026-04-16T23:35:35.742887+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:35:38.452221+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", 
"type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": false, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "encrypted volume 'test1' missing key/password", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:35:38.148249+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:40:03.140632+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479' in safe mode due to encryption removal", "start_time": "2026-04-16T23:40:00.685820+00:00Z", "task_name": "Manage the pools and volumes to match the specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:40:03.373493+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, 
"pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'luks-1edafecf-4d8e-4f78-aff4-ede6c0f23479' in safe mode due to encryption removal", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:40:03.220407+00:00Z", "task_name": "Failed message", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:42:22.437758+00:00Z", "host": "managed-node13", "message": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "start_time": "2026-04-16T23:42:20.433892+00:00Z", "task_name": "Manage the pools and volumes to match the 
specified state", "task_path": "/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88" }, { "ansible_version": "2.17.14", "end_time": "2026-04-16T23:42:22.787088+00:00Z", "host": "managed-node13", "message": { "_ansible_no_log": false, "actions": [], "changed": false, "crypts": [], "failed": true, "invocation": { "module_args": { "disklabel_type": null, "diskvolume_mkfs_option_map": {}, "packages_only": false, "pool_defaults": { "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "grow_to_fill": false, "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [] }, "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "part_type": null, "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "safe_mode": true, "use_partitions": null, "uses_kmod_kvdo": true, "volume_defaults": { "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_options": "defaults", "mount_passno": 0, "mount_point": "", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 0, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null }, "volumes": [] } }, "leaves": [], "mounts": [], "msg": "cannot remove existing formatting on device 'test1' in safe mode due to adding encryption", "packages": [], "pools": [], "volumes": [] }, "start_time": "2026-04-16T23:42:22.513329+00:00Z", "task_name": "Failed message", "task_path": 
"/tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:129" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Thursday 16 April 2026 19:45:34 -0400 (0:00:00.183) 0:23:10.204 ******** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 13.40s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 12.17s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.94s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.69s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.32s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.73s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:88 Gathering Facts --------------------------------------------------------- 7.09s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 fedora.linux_system_roles.storage : Get service facts ------------------- 6.56s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab --- 5.99s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:179 fedora.linux_system_roles.storage : Get service facts ------------------- 5.10s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Get service facts ------------------- 5.00s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Get service facts ------------------- 4.51s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Make sure blivet is available ------- 4.20s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:20 Parse the actual size of the volume ------------------------------------- 4.15s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 fedora.linux_system_roles.storage : Get service facts ------------------- 4.15s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Get service facts ------------------- 4.03s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 fedora.linux_system_roles.storage : Get required 
packages --------------- 3.99s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 fedora.linux_system_roles.storage : Get service facts ------------------- 3.87s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:76 Write the key into the key file ----------------------------------------- 3.73s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:313 fedora.linux_system_roles.storage : Set up new/current mounts ----------- 3.69s /tmp/collections-HLp/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:184
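
Note on the SYSTEM ROLES ERRORS block above: every failure it records is provoked deliberately by tests_luks.yml and then rescued (PLAY RECAP shows failed=0, rescued=18). The errors fall into two classes visible in the captured module_args: calling the role with "encryption": true but no key or password ("encrypted volume ... missing key/password"), and adding or removing LUKS on a device that already carries formatting while "safe_mode": true, which refuses destructive changes. The sketch below shows how a caller would avoid both classes. It is a minimal illustration, not taken from the test playbook: storage_safe_mode, storage_volumes, encryption, and encryption_password are the storage role's documented variables, while the host name, disk, mount point, and passphrase are placeholder values.

    # Sketch only; disk "sda", mount point, and the passphrase are illustrative.
    - hosts: managed-node13
      tasks:
        - name: Encrypted volume with a passphrase supplied (avoids "missing key/password")
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks: [sda]
                mount_point: /opt/test1
                encryption: true
                encryption_password: "CHANGE_ME"   # placeholder passphrase

        - name: Remove encryption from an already-formatted device (avoids the safe-mode refusal)
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_safe_mode: false   # default is true; safe mode blocks removing existing formatting
            storage_volumes:
              - name: foo
                type: disk
                disks: [sda]
                mount_point: /opt/test1
                encryption: false

The same safe-mode toggle covers the "adding encryption" variants of the error, since both directions require destroying the existing formatting on the device.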